Diff of the two buildlogs:
--
--- b1/build.log	2023-04-22 05:40:50.601606270 +0000
+++ b2/build.log	2023-04-22 05:53:02.661144670 +0000
@@ -1,6 +1,6 @@
 I: pbuilder: network access will be disabled during build
-I: Current time: Thu May 23 23:47:22 -12 2024
-I: pbuilder-time-stamp: 1716551242
+I: Current time: Sat Apr 22 19:40:52 +14 2023
+I: pbuilder-time-stamp: 1682142052
 I: Building the build Environment
 I: extracting base tarball [/var/cache/pbuilder/bookworm-reproducible-base.tgz]
 I: copying local configuration
@@ -16,7 +16,7 @@
 I: copying [./locust_2.12.1.orig.tar.xz]
 I: copying [./locust_2.12.1-1.debian.tar.xz]
 I: Extracting source
-gpgv: Signature made Fri Dec 23 18:00:51 2022 -12
+gpgv: Signature made Sat Dec 24 20:00:51 2022 +14
 gpgv: using RSA key B9FAD3192AF3E4A5309D9D39879F3C993801A94F
 gpgv: Can't check signature: No public key
 dpkg-source: warning: cannot verify inline signature for ./locust_2.12.1-1.dsc: no acceptable signature found
@@ -27,135 +27,167 @@
 dpkg-source: info: applying doc-no-cli-output-or-sphinx-search.patch
 I: using fakeroot in build.
 I: Installing the build-deps
-I: user script /srv/workspace/pbuilder/178780/tmp/hooks/D02_print_environment starting
+I: user script /srv/workspace/pbuilder/3216951/tmp/hooks/D01_modify_environment starting
+debug: Running on ionos1-amd64.
+I: Changing host+domainname to test build reproducibility
+I: Adding a custom variable just for the fun of it...
+I: Changing /bin/sh to bash
+'/bin/sh' -> '/bin/bash'
+lrwxrwxrwx 1 root root 9 Apr 22 19:41 /bin/sh -> /bin/bash
+I: Setting pbuilder2's login shell to /bin/bash
+I: Setting pbuilder2's GECOS to second user,second room,second work-phone,second home-phone,second other
+I: user script /srv/workspace/pbuilder/3216951/tmp/hooks/D01_modify_environment finished
+I: user script /srv/workspace/pbuilder/3216951/tmp/hooks/D02_print_environment starting
 I: set
-    BUILDDIR='/build'
-    BUILDUSERGECOS='first user,first room,first work-phone,first home-phone,first other'
-    BUILDUSERNAME='pbuilder1'
-    BUILD_ARCH='amd64'
-    DEBIAN_FRONTEND='noninteractive'
-    DEB_BUILD_OPTIONS='buildinfo=+all reproducible=+all parallel=16'
-    DISTRIBUTION='bookworm'
-    HOME='/root'
-    HOST_ARCH='amd64'
+    BASH=/bin/sh
+    BASHOPTS=checkwinsize:cmdhist:complete_fullquote:extquote:force_fignore:globasciiranges:globskipdots:hostcomplete:interactive_comments:patsub_replacement:progcomp:promptvars:sourcepath
+    BASH_ALIASES=()
+    BASH_ARGC=()
+    BASH_ARGV=()
+    BASH_CMDS=()
+    BASH_LINENO=([0]="12" [1]="0")
+    BASH_LOADABLES_PATH=/usr/local/lib/bash:/usr/lib/bash:/opt/local/lib/bash:/usr/pkg/lib/bash:/opt/pkg/lib/bash:.
+    BASH_SOURCE=([0]="/tmp/hooks/D02_print_environment" [1]="/tmp/hooks/D02_print_environment")
+    BASH_VERSINFO=([0]="5" [1]="2" [2]="15" [3]="1" [4]="release" [5]="x86_64-pc-linux-gnu")
+    BASH_VERSION='5.2.15(1)-release'
+    BUILDDIR=/build
+    BUILDUSERGECOS='second user,second room,second work-phone,second home-phone,second other'
+    BUILDUSERNAME=pbuilder2
+    BUILD_ARCH=amd64
+    DEBIAN_FRONTEND=noninteractive
+    DEB_BUILD_OPTIONS='buildinfo=+all reproducible=+all parallel=15'
+    DIRSTACK=()
+    DISTRIBUTION=bookworm
+    EUID=0
+    FUNCNAME=([0]="Echo" [1]="main")
+    GROUPS=()
+    HOME=/root
+    HOSTNAME=i-capture-the-hostname
+    HOSTTYPE=x86_64
+    HOST_ARCH=amd64
     IFS=' 	
'
-    INVOCATION_ID='2759a2bcdce74a738b216ddfcab8c3d8'
-    LANG='C'
-    LANGUAGE='en_US:en'
-    LC_ALL='C'
-    MAIL='/var/mail/root'
-    OPTIND='1'
-    PATH='/usr/sbin:/usr/bin:/sbin:/bin:/usr/games'
-    PBCURRENTCOMMANDLINEOPERATION='build'
-    PBUILDER_OPERATION='build'
-    PBUILDER_PKGDATADIR='/usr/share/pbuilder'
-    PBUILDER_PKGLIBDIR='/usr/lib/pbuilder'
-    PBUILDER_SYSCONFDIR='/etc'
-    PPID='178780'
-    PS1='# '
-    PS2='> '
+    INVOCATION_ID=a01a51c230494c4496d3d15ae9b4e723
+    LANG=C
+    LANGUAGE=et_EE:et
+    LC_ALL=C
+    MACHTYPE=x86_64-pc-linux-gnu
+    MAIL=/var/mail/root
+    OPTERR=1
+    OPTIND=1
+    OSTYPE=linux-gnu
+    PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/i/capture/the/path
+    PBCURRENTCOMMANDLINEOPERATION=build
+    PBUILDER_OPERATION=build
+    PBUILDER_PKGDATADIR=/usr/share/pbuilder
+    PBUILDER_PKGLIBDIR=/usr/lib/pbuilder
+    PBUILDER_SYSCONFDIR=/etc
+    PIPESTATUS=([0]="0")
+    POSIXLY_CORRECT=y
+    PPID=3216951
     PS4='+ '
-    PWD='/'
-    SHELL='/bin/bash'
-    SHLVL='2'
-    SUDO_COMMAND='/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.IrjmrTS1/pbuilderrc_X28S --distribution bookworm --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/bookworm-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.IrjmrTS1/b1 --logfile b1/build.log locust_2.12.1-1.dsc'
-    SUDO_GID='111'
-    SUDO_UID='106'
-    SUDO_USER='jenkins'
-    TERM='unknown'
-    TZ='/usr/share/zoneinfo/Etc/GMT+12'
-    USER='root'
-    _='/usr/bin/systemd-run'
-    http_proxy='http://85.184.249.68:3128'
+    PWD=/
+    SHELL=/bin/bash
+    SHELLOPTS=braceexpand:errexit:hashall:interactive-comments:posix
+    SHLVL=3
+    SUDO_COMMAND='/usr/bin/timeout -k 24.1h 24h /usr/bin/ionice -c 3 /usr/bin/nice -n 11 /usr/bin/unshare --uts -- /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.IrjmrTS1/pbuilderrc_aQ3o --distribution bookworm --hookdir /etc/pbuilder/rebuild-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/bookworm-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.IrjmrTS1/b2 --logfile b2/build.log --extrapackages usrmerge locust_2.12.1-1.dsc'
+    SUDO_GID=110
+    SUDO_UID=105
+    SUDO_USER=jenkins
+    TERM=unknown
+    TZ=/usr/share/zoneinfo/Etc/GMT-14
+    UID=0
+    USER=root
+    _='I: set'
+    http_proxy=http://78.137.99.97:3128
 I: uname -a
- Linux ionos15-amd64 6.1.0-0.deb11.5-amd64 #1 SMP PREEMPT_DYNAMIC Debian 6.1.12-1~bpo11+1 (2023-03-05) x86_64 GNU/Linux
+ Linux i-capture-the-hostname 5.10.0-21-amd64 #1 SMP Debian 5.10.162-1 (2023-01-21) x86_64 GNU/Linux
 I: ls -l /bin
 total 5632
- -rwxr-xr-x 1 root root 1265648 Feb 12 2023 bash
- -rwxr-xr-x 3 root root 39224 Sep 18 2022 bunzip2
- -rwxr-xr-x 3 root root 39224 Sep 18 2022 bzcat
- lrwxrwxrwx 1 root root 6 Sep 18 2022 bzcmp -> bzdiff
- -rwxr-xr-x 1 root root 2225 Sep 18 2022 bzdiff
- lrwxrwxrwx 1 root root 6 Sep 18 2022 bzegrep -> bzgrep
- -rwxr-xr-x 1 root root 4893 Nov 27 2021 bzexe
- lrwxrwxrwx 1 root root 6 Sep 18 2022 bzfgrep -> bzgrep
- -rwxr-xr-x 1 root root 3775 Sep 18 2022 bzgrep
- -rwxr-xr-x 3 root root 39224 Sep 18 2022 bzip2
- -rwxr-xr-x 1 root root 14568 Sep 18 2022 bzip2recover
- lrwxrwxrwx 1 root root 6 Sep 18 2022 bzless -> bzmore
- -rwxr-xr-x 1 root root 1297 Sep 18 2022 bzmore
- -rwxr-xr-x 1 root root 44016 Sep 20 2022 cat
- -rwxr-xr-x 1 root root 68656 Sep 20 2022 chgrp
- -rwxr-xr-x 1 root root 64496 Sep 20 2022 chmod
- -rwxr-xr-x 1 root root 72752 Sep 20 2022 chown
- -rwxr-xr-x 1 root root 151152 Sep 20 2022 cp
- -rwxr-xr-x 1 root root 125640 Jan 5 2023 dash
- -rwxr-xr-x 1 root root 121904 Sep 20 2022 date
- -rwxr-xr-x 1 root root 89240 Sep 20 2022 dd
- -rwxr-xr-x 1 root root 102200 Sep 20 2022 df
- -rwxr-xr-x 1 root root 151344 Sep 20 2022 dir
- -rwxr-xr-x 1 root root 88656 Mar 22 2023 dmesg
- lrwxrwxrwx 1 root root 8 Dec 19 2022 dnsdomainname -> hostname
- lrwxrwxrwx 1 root root 8 Dec 19 2022 domainname -> hostname
- -rwxr-xr-x 1 root root 43856 Sep 20 2022 echo
- -rwxr-xr-x 1 root root 41 Jan 24 2023 egrep
- -rwxr-xr-x 1 root root 35664 Sep 20 2022 false
- -rwxr-xr-x 1 root root 41 Jan 24 2023 fgrep
- -rwxr-xr-x 1 root root 85600 Mar 22 2023 findmnt
- -rwsr-xr-x 1 root root 35128 Mar 22 2023 fusermount
- -rwxr-xr-x 1 root root 203152 Jan 24 2023 grep
- -rwxr-xr-x 2 root root 2346 Apr 9 2022 gunzip
- -rwxr-xr-x 1 root root 6447 Apr 9 2022 gzexe
- -rwxr-xr-x 1 root root 98136 Apr 9 2022 gzip
- -rwxr-xr-x 1 root root 22680 Dec 19 2022 hostname
- -rwxr-xr-x 1 root root 72824 Sep 20 2022 ln
- -rwxr-xr-x 1 root root 53024 Mar 23 2023 login
- -rwxr-xr-x 1 root root 151344 Sep 20 2022 ls
- -rwxr-xr-x 1 root root 207168 Mar 22 2023 lsblk
- -rwxr-xr-x 1 root root 97552 Sep 20 2022 mkdir
- -rwxr-xr-x 1 root root 72912 Sep 20 2022 mknod
- -rwxr-xr-x 1 root root 43952 Sep 20 2022 mktemp
- -rwxr-xr-x 1 root root 59712 Mar 22 2023 more
- -rwsr-xr-x 1 root root 59704 Mar 22 2023 mount
- -rwxr-xr-x 1 root root 18744 Mar 22 2023 mountpoint
- -rwxr-xr-x 1 root root 142968 Sep 20 2022 mv
- lrwxrwxrwx 1 root root 8 Dec 19 2022 nisdomainname -> hostname
- lrwxrwxrwx 1 root root 14 Apr 2 2023 pidof -> /sbin/killall5
- -rwxr-xr-x 1 root root 43952 Sep 20 2022 pwd
- lrwxrwxrwx 1 root root 4 Feb 12 2023 rbash -> bash
- -rwxr-xr-x 1 root root 52112 Sep 20 2022 readlink
- -rwxr-xr-x 1 root root 72752 Sep 20 2022 rm
- -rwxr-xr-x 1 root root 56240 Sep 20 2022 rmdir
- -rwxr-xr-x 1 root root 27560 Nov 2 2022 run-parts
- -rwxr-xr-x 1 root root 126424 Jan 5 2023 sed
- lrwxrwxrwx 1 root root 4 Jan 5 2023 sh -> dash
- -rwxr-xr-x 1 root root 43888 Sep 20 2022 sleep
- -rwxr-xr-x 1 root root 85008 Sep 20 2022 stty
- -rwsr-xr-x 1 root root 72000 Mar 22 2023 su
- -rwxr-xr-x 1 root root 39824 Sep 20 2022 sync
- -rwxr-xr-x 1 root root 531984 Apr 6 2023 tar
- -rwxr-xr-x 1 root root 14520 Nov 2 2022 tempfile
- -rwxr-xr-x 1 root root 109616 Sep 20 2022 touch
- -rwxr-xr-x 1 root root 35664 Sep 20 2022 true
- -rwxr-xr-x 1 root root 14568 Mar 22 2023 ulockmgr_server
- -rwsr-xr-x 1 root root 35128 Mar 22 2023 umount
- -rwxr-xr-x 1 root root 43888 Sep 20 2022 uname
- -rwxr-xr-x 2 root root 2346 Apr 9 2022 uncompress
- -rwxr-xr-x 1 root root 151344 Sep 20 2022 vdir
- -rwxr-xr-x 1 root root 72024 Mar 22 2023 wdctl
- lrwxrwxrwx 1 root root 8 Dec 19 2022 ypdomainname -> hostname
- -rwxr-xr-x 1 root root 1984 Apr 9 2022 zcat
- -rwxr-xr-x 1 root root 1678 Apr 9 2022 zcmp
- -rwxr-xr-x 1 root root 6460 Apr 9 2022 zdiff
- -rwxr-xr-x 1 root root 29 Apr 9 2022 zegrep
- -rwxr-xr-x 1 root root 29 Apr 9 2022 zfgrep
- -rwxr-xr-x 1 root root 2081 Apr 9 2022 zforce
- -rwxr-xr-x 1 root root 8103 Apr 9 2022 zgrep
- -rwxr-xr-x 1 root root 2206 Apr 9 2022 zless
- -rwxr-xr-x 1 root root 1842 Apr 9 2022 zmore
- -rwxr-xr-x 1 root root 4577 Apr 9 2022 znew
-I: user script /srv/workspace/pbuilder/178780/tmp/hooks/D02_print_environment finished
+ -rwxr-xr-x 1 root root 1265648 Feb 13 10:05 bash
+ -rwxr-xr-x 3 root root 39224 Sep 19 2022 bunzip2
+ -rwxr-xr-x 3 root root 39224 Sep 19 2022 bzcat
+ lrwxrwxrwx 1 root root 6 Sep 19 2022 bzcmp -> bzdiff
+ -rwxr-xr-x 1 root root 2225 Sep 19 2022 bzdiff
+ lrwxrwxrwx 1 root root 6 Sep 19 2022 bzegrep -> bzgrep
+ -rwxr-xr-x 1 root root 4893 Nov 28 2021 bzexe
+ lrwxrwxrwx 1 root root 6 Sep 19 2022 bzfgrep -> bzgrep
+ -rwxr-xr-x 1 root root 3775 Sep 19 2022 bzgrep
+ -rwxr-xr-x 3 root root 39224 Sep 19 2022 bzip2
+ -rwxr-xr-x 1 root root 14568 Sep 19 2022 bzip2recover
+ lrwxrwxrwx 1 root root 6 Sep 19 2022 bzless -> bzmore
+ -rwxr-xr-x 1 root root 1297 Sep 19 2022 bzmore
+ -rwxr-xr-x 1 root root 44016 Sep 21 2022 cat
+ -rwxr-xr-x 1 root root 68656 Sep 21 2022 chgrp
+ -rwxr-xr-x 1 root root 64496 Sep 21 2022 chmod
+ -rwxr-xr-x 1 root root 72752 Sep 21 2022 chown
+ -rwxr-xr-x 1 root root 151152 Sep 21 2022 cp
+ -rwxr-xr-x 1 root root 125640 Jan 6 03:20 dash
+ -rwxr-xr-x 1 root root 121904 Sep 21 2022 date
+ -rwxr-xr-x 1 root root 89240 Sep 21 2022 dd
+ -rwxr-xr-x 1 root root 102200 Sep 21 2022 df
+ -rwxr-xr-x 1 root root 151344 Sep 21 2022 dir
+ -rwxr-xr-x 1 root root 88656 Mar 24 00:02 dmesg
+ lrwxrwxrwx 1 root root 8 Dec 20 03:33 dnsdomainname -> hostname
+ lrwxrwxrwx 1 root root 8 Dec 20 03:33 domainname -> hostname
+ -rwxr-xr-x 1 root root 43856 Sep 21 2022 echo
+ -rwxr-xr-x 1 root root 41 Jan 25 04:43 egrep
+ -rwxr-xr-x 1 root root 35664 Sep 21 2022 false
+ -rwxr-xr-x 1 root root 41 Jan 25 04:43 fgrep
+ -rwxr-xr-x 1 root root 85600 Mar 24 00:02 findmnt
+ -rwsr-xr-x 1 root root 35128 Mar 23 22:35 fusermount
+ -rwxr-xr-x 1 root root 203152 Jan 25 04:43 grep
+ -rwxr-xr-x 2 root root 2346 Apr 10 2022 gunzip
+ -rwxr-xr-x 1 root root 6447 Apr 10 2022 gzexe
+ -rwxr-xr-x 1 root root 98136 Apr 10 2022 gzip
+ -rwxr-xr-x 1 root root 22680 Dec 20 03:33 hostname
+ -rwxr-xr-x 1 root root 72824 Sep 21 2022 ln
+ -rwxr-xr-x 1 root root 53024 Mar 24 02:40 login
+ -rwxr-xr-x 1 root root 151344 Sep 21 2022 ls
+ -rwxr-xr-x 1 root root 207168 Mar 24 00:02 lsblk
+ -rwxr-xr-x 1 root root 97552 Sep 21 2022 mkdir
+ -rwxr-xr-x 1 root root 72912 Sep 21 2022 mknod
+ -rwxr-xr-x 1 root root 43952 Sep 21 2022 mktemp
+ -rwxr-xr-x 1 root root 59712 Mar 24 00:02 more
+ -rwsr-xr-x 1 root root 59704 Mar 24 00:02 mount
+ -rwxr-xr-x 1 root root 18744 Mar 24 00:02 mountpoint
+ -rwxr-xr-x 1 root root 142968 Sep 21 2022 mv
+ lrwxrwxrwx 1 root root 8 Dec 20 03:33 nisdomainname -> hostname
+ lrwxrwxrwx 1 root root 14 Apr 3 20:25 pidof -> /sbin/killall5
+ -rwxr-xr-x 1 root root 43952 Sep 21 2022 pwd
+ lrwxrwxrwx 1 root root 4 Feb 13 10:05 rbash -> bash
+ -rwxr-xr-x 1 root root 52112 Sep 21 2022 readlink
+ -rwxr-xr-x 1 root root 72752 Sep 21 2022 rm
+ -rwxr-xr-x 1 root root 56240 Sep 21 2022 rmdir
+ -rwxr-xr-x 1 root root 27560 Nov 3 06:31 run-parts
+ -rwxr-xr-x 1 root root 126424 Jan 6 09:55 sed
+ lrwxrwxrwx 1 root root 9 Apr 22 19:41 sh -> /bin/bash
+ -rwxr-xr-x 1 root root 43888 Sep 21 2022 sleep
+ -rwxr-xr-x 1 root root 85008 Sep 21 2022 stty
+ -rwsr-xr-x 1 root root 72000 Mar 24 00:02 su
+ -rwxr-xr-x 1 root root 39824 Sep 21 2022 sync
+ -rwxr-xr-x 1 root root 531984 Apr 7 04:25 tar
+ -rwxr-xr-x 1 root root 14520 Nov 3 06:31 tempfile
+ -rwxr-xr-x 1 root root 109616 Sep 21 2022 touch
+ -rwxr-xr-x 1 root root 35664 Sep 21 2022 true
+ -rwxr-xr-x 1 root root 14568 Mar 23 22:35 ulockmgr_server
+ -rwsr-xr-x 1 root root 35128 Mar 24 00:02 umount
+ -rwxr-xr-x 1 root root 43888 Sep 21 2022 uname
+ -rwxr-xr-x 2 root root 2346 Apr 10 2022 uncompress
+ -rwxr-xr-x 1 root root 151344 Sep 21 2022 vdir
+ -rwxr-xr-x 1 root root 72024 Mar 24 00:02 wdctl
+ lrwxrwxrwx 1 root root 8 Dec 20 03:33 ypdomainname -> hostname
+ -rwxr-xr-x 1 root root 1984 Apr 10 2022 zcat
+ -rwxr-xr-x 1 root root 1678 Apr 10 2022 zcmp
+ -rwxr-xr-x 1 root root 6460 Apr 10 2022 zdiff
+ -rwxr-xr-x 1 root root 29 Apr 10 2022 zegrep
+ -rwxr-xr-x 1 root root 29 Apr 10 2022 zfgrep
+ -rwxr-xr-x 1 root root 2081 Apr 10 2022 zforce
+ -rwxr-xr-x 1 root root 8103 Apr 10 2022 zgrep
+ -rwxr-xr-x 1 root root 2206 Apr 10 2022 zless
+ -rwxr-xr-x 1 root root 1842 Apr 10 2022 zmore
+ -rwxr-xr-x 1 root root 4577 Apr 10 2022 znew
+I: user script /srv/workspace/pbuilder/3216951/tmp/hooks/D02_print_environment finished
 -> Attempting to satisfy build-dependencies
 -> Creating pbuilder-satisfydepends-dummy package
 Package: pbuilder-satisfydepends-dummy
@@ -344,7 +376,7 @@
 Get: 115 http://deb.debian.org/debian bookworm/main amd64 python3-retry all 0.9.2-2 [7020 B]
 Get: 116 http://deb.debian.org/debian bookworm/main amd64 python3-roundrobin all 0.0.4-2 [4324 B]
 Get: 117 http://deb.debian.org/debian bookworm/main amd64 python3-zmq amd64 24.0.1-4+b1 [263 kB]
-Fetched 39.3 MB in 1s (30.3 MB/s)
+Fetched 39.3 MB in 1s (28.7 MB/s)
 debconf: delaying package configuration, since apt-utils is not installed
 Selecting previously unselected package fonts-lato.
 (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 19596 files and directories currently installed.)
@@ -838,10 +870,17 @@
 Reading package lists...
 Building dependency tree...
 Reading state information...
+usrmerge is already the newest version (35).
 fakeroot is already the newest version (1.31-1.2).
 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
 I: Building the package
-I: Running cd /build/locust-2.12.1/ && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games" HOME="/nonexistent/first-build" dpkg-buildpackage -us -uc -b && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games" HOME="/nonexistent/first-build" dpkg-genchanges -S > ../locust_2.12.1-1_source.changes
+I: user script /srv/workspace/pbuilder/3216951/tmp/hooks/A99_set_merged_usr starting
+Re-configuring usrmerge...
+removed '/etc/unsupported-skip-usrmerge-conversion'
+The system has been successfully converted.
+I: user script /srv/workspace/pbuilder/3216951/tmp/hooks/A99_set_merged_usr finished
+hostname: Name or service not known
+I: Running cd /build/locust-2.12.1/ && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/i/capture/the/path" HOME="/nonexistent/second-build" dpkg-buildpackage -us -uc -b && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/i/capture/the/path" HOME="/nonexistent/second-build" dpkg-genchanges -S > ../locust_2.12.1-1_source.changes
 dpkg-buildpackage: info: source package locust
 dpkg-buildpackage: info: source version 2.12.1-1
 dpkg-buildpackage: info: source distribution unstable
@@ -1229,7 +1268,7 @@
 Copying locust.egg-info to build/bdist.linux-x86_64/wheel/locust-0.0.0.egg-info
 running install_scripts
 creating build/bdist.linux-x86_64/wheel/locust-0.0.0.dist-info/WHEEL
-creating '/build/locust-2.12.1/.pybuild/cpython3_3.11_locust/.tmp-htbcdn_c/locust-0.0.0-py3-none-any.whl' and adding 'build/bdist.linux-x86_64/wheel' to it
+creating '/build/locust-2.12.1/.pybuild/cpython3_3.11_locust/.tmp-bbrsv57q/locust-0.0.0-py3-none-any.whl' and adding 'build/bdist.linux-x86_64/wheel' to it
 adding 'locust/__init__.py'
 adding 'locust/__main__.py'
 adding 'locust/_version.py'
@@ -1332,7 +1371,7 @@
 make[1]: Entering directory '/build/locust-2.12.1'
 # examples/test_data_management.py - attempted network access during pytest collection phase
 # TestMasterWorkerRunners times out after a long time
-http_proxy= https_proxy= PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/build/locust-2.12.1/debian PYBUILD_SYSTEM=custom PYBUILD_TEST_ARGS="PYTHONPATH={build_dir} {interpreter} -m pytest -v --ignore=examples/test_data_management.py -k 'not TestMasterWorkerRunners'" dh_auto_test
+http_proxy= https_proxy= PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/i/capture/the/path:/build/locust-2.12.1/debian PYBUILD_SYSTEM=custom PYBUILD_TEST_ARGS="PYTHONPATH={build_dir} {interpreter} -m pytest -v --ignore=examples/test_data_management.py -k 'not TestMasterWorkerRunners'" dh_auto_test
 I: pybuild base:240: PYTHONPATH=/build/locust-2.12.1/.pybuild/cpython3_3.11_locust/build python3.11 -m pytest -v --ignore=examples/test_data_management.py -k 'not TestMasterWorkerRunners'
 ============================= test session starts ==============================
 platform linux -- Python 3.11.2, pytest-7.2.1, pluggy-1.0.0+repack -- /usr/bin/python3.11
@@ -1361,9 +1400,9 @@
 locust/test/test_dispatch.py::TestRampUpThenDownThenUp::test_ramp_up_then_down_then_up PASSED [ 3%]
 locust/test/test_dispatch.py::TestDispatchUsersToWorkersHavingTheSameUsersAsTheTarget::test_dispatch_users_to_3_workers PASSED [ 3%]
 locust/test/test_dispatch.py::TestDistributionIsRespectedDuringDispatch::test_dispatch_75_users_to_4_workers_with_spawn_rate_of_5 PASSED [ 4%]
-locust/test/test_dispatch.py::TestLargeScale::test_distribute_users FAILED [ 4%]
+locust/test/test_dispatch.py::TestLargeScale::test_distribute_users PASSED [ 4%]
 locust/test/test_dispatch.py::TestLargeScale::test_ramp_down_from_100_000_to_0_users_with_50_user_classes_and_1000_workers_and_5000_spawn_rate PASSED [ 4%]
-locust/test/test_dispatch.py::TestLargeScale::test_ramp_up_from_0_to_100_000_users_with_50_user_classes_and_1000_workers_and_5000_spawn_rate FAILED [ 4%]
+locust/test/test_dispatch.py::TestLargeScale::test_ramp_up_from_0_to_100_000_users_with_50_user_classes_and_1000_workers_and_5000_spawn_rate PASSED [ 4%]
 locust/test/test_dispatch.py::TestSmallConsecutiveRamping::test_consecutive_ramp_up_and_ramp_down PASSED [ 4%]
 locust/test/test_dispatch.py::TestRampingMiscellaneous::test_spawn_rate_greater_than_target_user_count PASSED [ 5%]
 locust/test/test_dispatch.py::TestRemoveWorker::test_remove_last_worker PASSED [ 5%]
@@ -1437,7 +1476,7 @@
 locust/test/test_fasthttp.py::TestFastHttpUserClass::test_client_get PASSED [ 18%]
 locust/test/test_fasthttp.py::TestFastHttpUserClass::test_client_get_absolute_url PASSED [ 19%]
 locust/test/test_fasthttp.py::TestFastHttpUserClass::test_client_head PASSED [ 19%]
-locust/test/test_fasthttp.py::TestFastHttpUserClass::test_client_pool_concurrency FAILED [ 19%]
+locust/test/test_fasthttp.py::TestFastHttpUserClass::test_client_pool_concurrency PASSED [ 19%]
 locust/test/test_fasthttp.py::TestFastHttpUserClass::test_client_pool_per_user_instance PASSED [ 19%]
 locust/test/test_fasthttp.py::TestFastHttpUserClass::test_client_post PASSED [ 19%]
 locust/test/test_fasthttp.py::TestFastHttpUserClass::test_client_put PASSED [ 20%]
@@ -1695,8 +1734,8 @@
 locust/test/test_runners.py::TestMessageSerializing::test_message_serialize PASSED [ 70%]
 locust/test/test_runners.py::TestStopTimeout::test_gracefully_handle_exceptions_in_listener PASSED [ 70%]
 locust/test/test_runners.py::TestStopTimeout::test_kill_locusts_with_stop_timeout FAILED [ 70%]
-locust/test/test_runners.py::TestStopTimeout::test_stop_timeout FAILED [ 70%]
-locust/test/test_runners.py::TestStopTimeout::test_stop_timeout_during_on_start PASSED [ 71%]
+locust/test/test_runners.py::TestStopTimeout::test_stop_timeout PASSED [ 70%]
+locust/test/test_runners.py::TestStopTimeout::test_stop_timeout_during_on_start FAILED [ 71%]
 locust/test/test_runners.py::TestStopTimeout::test_stop_timeout_exit_during_wait PASSED [ 71%]
 locust/test/test_runners.py::TestStopTimeout::test_stop_timeout_with_interrupt PASSED [ 71%]
 locust/test/test_runners.py::TestStopTimeout::test_stop_timeout_with_interrupt_no_reschedule PASSED [ 71%]
@@ -1736,7 +1775,7 @@
 locust/test/test_stats.py::TestStatsPrinting::test_print_stats PASSED [ 78%]
 locust/test/test_stats.py::TestCsvStats::test_csv_stats_on_master_from_aggregated_stats PASSED [ 78%]
 locust/test/test_stats.py::TestCsvStats::test_csv_stats_writer PASSED [ 78%]
-locust/test/test_stats.py::TestCsvStats::test_csv_stats_writer_full_history FAILED [ 79%]
+locust/test/test_stats.py::TestCsvStats::test_csv_stats_writer_full_history PASSED [ 79%]
 locust/test/test_stats.py::TestCsvStats::test_requests_csv_quote_escaping PASSED [ 79%]
 locust/test/test_stats.py::TestCsvStats::test_stats_history FAILED [ 79%]
 locust/test/test_stats.py::TestCsvStats::test_user_count_in_csv_history_stats FAILED [ 79%]
@@ -1790,7 +1829,7 @@
 locust/test/test_util.py::TestRounding::test_rounding_up PASSED [ 89%]
 locust/test/test_wait_time.py::TestWaitTime::test_between PASSED [ 89%]
 locust/test/test_wait_time.py::TestWaitTime::test_constant PASSED [ 89%]
-locust/test/test_wait_time.py::TestWaitTime::test_constant_throughput PASSED [ 89%]
+locust/test/test_wait_time.py::TestWaitTime::test_constant_throughput FAILED [ 89%]
 locust/test/test_wait_time.py::TestWaitTime::test_default_wait_time PASSED [ 90%]
 locust/test/test_web.py::TestWebUI::test_exceptions PASSED [ 90%]
 locust/test/test_web.py::TestWebUI::test_exceptions_csv PASSED [ 90%]
@@ -1844,94 +1883,6 @@
 locust/test/test_zmqrpc.py::ZMQRPC_tests::test_rpc_error PASSED [100%]
 =================================== FAILURES ===================================
-_____________________ TestLargeScale.test_distribute_users _____________________
-
-self =
-
-    def test_distribute_users(self):
-        for user_classes in [self.weighted_user_classes, self.fixed_user_classes_1M, self.mixed_users]:
-            workers = [WorkerNode(str(i)) for i in range(10_000)]
-
-            target_user_count = 1_000_000
-
-            users_dispatcher = UsersDispatcher(worker_nodes=workers, user_classes=user_classes)
-
-            ts = time.perf_counter()
-            users_on_workers, user_gen, worker_gen, active_users = users_dispatcher._distribute_users(
-                target_user_count=target_user_count
-            )
-            delta = time.perf_counter() - ts
-
-            # Because tests are run with coverage, the code will be slower.
-            # We set the pass criterion to 7000ms, but in real life, the
-            # `_distribute_users` method runs faster than this.
->           self.assertLessEqual(1000 * delta, 7000)
-E           AssertionError: 8192.842537988327 not less than or equal to 7000
-
-locust/test/test_dispatch.py:2072: AssertionError
-_ TestLargeScale.test_ramp_up_from_0_to_100_000_users_with_50_user_classes_and_1000_workers_and_5000_spawn_rate _
-
-self =
-
-    def test_ramp_up_from_0_to_100_000_users_with_50_user_classes_and_1000_workers_and_5000_spawn_rate(self):
-        for user_classes in [
-            self.weighted_user_classes,
-            self.fixed_user_classes_1M,
-            self.fixed_user_classes_10k,
-            self.mixed_users,
-        ]:
-            workers = [WorkerNode(str(i)) for i in range(1000)]
-
-            target_user_count = 100_000
-
-            users_dispatcher = UsersDispatcher(worker_nodes=workers, user_classes=user_classes)
-            users_dispatcher.new_dispatch(target_user_count=target_user_count, spawn_rate=5_000)
-            users_dispatcher._wait_between_dispatch = 0
-
-            all_dispatched_users = list(users_dispatcher)
-
-            tol = 0.2
->           self.assertTrue(
-                all(
-                    dispatch_iteration_duration <= tol
-                    for dispatch_iteration_duration in users_dispatcher.dispatch_iteration_durations
-                ),
-                "One or more dispatch took more than {:.0f}s to compute (max = {}ms)".format(
-                    tol * 1000, 1000 * max(users_dispatcher.dispatch_iteration_durations)
-                ),
-            )
-E           AssertionError: False is not true : One or more dispatch took more than 200s to compute (max = 257.9829110036371ms)
-
-locust/test/test_dispatch.py:2094: AssertionError
-______________ TestFastHttpUserClass.test_client_pool_concurrency ______________
-
-self =
-
-    def test_client_pool_concurrency(self):
-        class MyUser(FastHttpUser):
-            host = "http://127.0.0.1:%i" % self.port
-
-            @task
-            def t(self):
-                def concurrent_request(url):
-                    response = self.client.get(url)
-                    assert response.status_code == 200
-
-                pool = gevent.pool.Pool()
-                urls = ["/slow?delay=0.2"] * 20  # these urls are all the same, but they could be different
-                for url in urls:
-                    pool.spawn(concurrent_request, url)
-                pool.join()
-
-        user = MyUser(self.environment)
-        before_requests = time.time()
-        user.t()
-        after_requests = time.time()
-        expected_delta = 0.4  # 20 requests with concurrency 10 and response time 0.2
->       self.assertAlmostEqual(before_requests + expected_delta, after_requests, delta=0.1)
-E       AssertionError: 1716551667.3420594 != 1716551667.485948 within 0.1 delta (0.1438887119293213 difference)
-
-locust/test/test_fasthttp.py:578: AssertionError
 _____________________ TestLoggingOptions.test_log_to_file ______________________
 
 self =
 
     def test_log_to_file(self):
@@ -1982,8 +1933,8 @@
         self._execute_child(args, executable, preexec_fn, close_fds,
 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-self =
-args = ['locust', '-f', '/tmp/tmprju7nia5_locustfile.py', '-u', '1', '-r', ...]
+self =
+args = ['locust', '-f', '/tmp/tmp216xee59_locustfile.py', '-u', '1', '-r', ...]
 executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = ()
 cwd = None, env = None, universal_newlines = None, startupinfo = None
 creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11
@@ -2290,8 +2241,8 @@
         self._execute_child(args, executable, preexec_fn, close_fds,
 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-self =
-args = ['locust', '-f', '/tmp/tmpl3y3xvq2_locustfile.py', '-u', '1', '-r', ...]
+self =
+args = ['locust', '-f', '/tmp/tmpz6k9qhy7_locustfile.py', '-u', '1', '-r', ...]
 executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = ()
 cwd = None, env = None, universal_newlines = None, startupinfo = None
 creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11
@@ -2594,8 +2545,8 @@
         self._execute_child(args, executable, preexec_fn, close_fds,
 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-self =
-args = ['locust', '-f', '/tmp/tmptsnszj15_locustfile.py', '-u', '1', '-r', ...]
+self =
+args = ['locust', '-f', '/tmp/tmpv6j7fhll_locustfile.py', '-u', '1', '-r', ...]
 executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = ()
 cwd = None, env = None, universal_newlines = None, startupinfo = None
 creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11
@@ -2873,7 +2824,7 @@
         self._execute_child(args, executable, preexec_fn, close_fds,
 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-self =
+self =
 args = ['locust', '--help'], executable = 'locust', preexec_fn = None
 close_fds = True, pass_fds = (), cwd = None, env = None
 universal_newlines = None, startupinfo = None, creationflags = 0, shell = False
@@ -3190,8 +3141,8 @@
         self._execute_child(args, executable, preexec_fn, close_fds,
 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-self =
-args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1716551699_125029_96984.py,/tmp/tmpqcx1wb82_locustfile.py', '--web-port', '59179', '--autostart', ...]
+self =
+args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1682142292_3126721_45210.py,/tmp/tmpciay7t2j_locustfile.py', '--web-port', '55901', '--autostart', ...]
 executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = ()
 cwd = None, env = None, universal_newlines = None, startupinfo = None
 creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11
@@ -3492,8 +3443,8 @@
         self._execute_child(args, executable, preexec_fn, close_fds,
 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-self =
-args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1716551699_4155607_54545.py', '--web-port', '40495', '--autostart', ...]
+self =
+args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1682142292_5126407_71818.py', '--web-port', '52145', '--autostart', ...]
 executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = ()
 cwd = None, env = None, universal_newlines = None, startupinfo = None
 creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11
@@ -3782,8 +3733,8 @@
         self._execute_child(args, executable, preexec_fn, close_fds,
 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-self =
-args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1716551699_681133_73856.py', '--web-port', '58293', '-t', ...]
+self =
+args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1682142292_8081417_77071.py', '--web-port', '41705', '-t', ...]
 executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = ()
 cwd = None, env = None, universal_newlines = None, startupinfo = None
 creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11
@@ -4068,8 +4019,8 @@
         self._execute_child(args, executable, preexec_fn, close_fds,
 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-self =
-args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1716551700_1839473_64209.py', '--web-port', '47195', '--autostart']
+self =
+args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1682142293_1356783_60050.py', '--web-port', '40491', '--autostart']
 executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = ()
 cwd = None, env = None, universal_newlines = None, startupinfo = None
 creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11
@@ -4382,7 +4333,7 @@
         output = proc.communicate()[0]
         self.assertNotIn("User1 is running", output)
>       self.assertIn("User2 is running", output)
-E       AssertionError: 'User2 is running' not found in '/build/locust-2.12.1/debian/locust: 2: from: not found\n/build/locust-2.12.1/debian/locust: 5: Syntax error: end of file unexpected\n'
+E       AssertionError: 'User2 is running' not found in '/build/locust-2.12.1/debian/locust: line 2: from: command not found\n/build/locust-2.12.1/debian/locust: line 5: syntax error: unexpected end of file\n'
 
 locust/test/test_main.py:936: AssertionError
 _______________ StandaloneIntegrationTests.test_custom_arguments _______________
@@ -4421,8 +4372,8 @@
         self._execute_child(args, executable, preexec_fn, close_fds,
 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-self =
-args = ['locust', '-f', '/tmp/tmp3vr4c5hy_locustfile.py', '--custom-string-arg', 'command_line_value', '--web-port', ...]
+self =
+args = ['locust', '-f', '/tmp/tmp7ae7h85o_locustfile.py', '--custom-string-arg', 'command_line_value', '--web-port', ...]
 executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = ()
 cwd = None, env = None, universal_newlines = None, startupinfo = None
 creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11
@@ -4717,8 +4668,8 @@
         self._execute_child(args, executable, preexec_fn, close_fds,
 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-self =
-args = ['locust', '-f', '/tmp/tmpqywkykgi_locustfile.py', '--autostart']
+self =
+args = ['locust', '-f', '/tmp/tmpnyx35mh9_locustfile.py', '--autostart']
 executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = ()
 cwd = None, env = None, universal_newlines = None, startupinfo = None
 creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11
@@ -5007,8 +4958,8 @@
         self._execute_child(args, executable, preexec_fn, close_fds,
 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-self =
-args = ['locust', '-f', '/tmp/tmp9fty3kfm_locustfile.py'], executable = 'locust'
+self =
+args = ['locust', '-f', '/tmp/tmpu5deeb8t_locustfile.py'], executable = 'locust'
 preexec_fn = None, close_fds = True, pass_fds = (), cwd = None, env = None
 universal_newlines = None, startupinfo = None, creationflags = 0, shell = False
 p2cread = -1, p2cwrite = -1, c2pread = 11, c2pwrite = 78, errread = 79
@@ -5299,8 +5250,8 @@
         self._execute_child(args, executable, preexec_fn, close_fds,
 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-self =
-args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1716551701_4192975_73966.py', '--host', 'https://test.com/', '--run-time', ...]
+self =
+args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1682142294_6265326_23102.py', '--host', 'https://test.com/', '--run-time', ...]
 executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = ()
 cwd = None, env = None, universal_newlines = None, startupinfo = None
 creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11
@@ -5597,8 +5548,8 @@
         self._execute_child(args, executable, preexec_fn, close_fds,
 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-self =
-args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1716551701_6051211_97361.py', '--host', 'https://test.com/', '--headless', ...]
+self =
+args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1682142295_0558922_43643.py', '--host', 'https://test.com/', '--headless', ...]
executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -5895,8 +5846,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/tmp/tmpc2grttnn_locustfile.py,/tmp/tmpzx2ve7oe_locustfile.py'] +self = +args = ['locust', '-f', '/tmp/tmpcs5g9fnl_locustfile.py,/tmp/tmpzhiqgrw7_locustfile.py'] executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -6179,8 +6130,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/tmp/tmp9ivqqmp7_locustfile.py,/tmp/tmp7sh0l8bw_locustfile.py'] +self = +args = ['locust', '-f', '/tmp/tmp78nafyic_locustfile.py,/tmp/tmp4bh2ycu2_locustfile.py'] executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -6452,8 +6403,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/tmp/tmp_1gzmy_x'], executable = 'locust' +self = +args = ['locust', '-f', '/tmp/tmp3fos4ry5'], executable = 'locust' preexec_fn = None, close_fds = True, pass_fds = (), cwd = None, env = None universal_newlines = None, startupinfo = None, creationflags = 0, shell = False p2cread = -1, p2cwrite = -1, c2pread = 11, c2pwrite = 78, errread = 79 @@ -6748,8 +6699,8 @@ self._execute_child(args, executable, preexec_fn, 
close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1716551702_5715394_86240.py', '--headless', '-t', '1', ...] +self = +args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1682142296_4470417_59256.py', '--headless', '-t', '1', ...] executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -7035,8 +6986,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1716551702_8211079_39192.py', '--host', 'https://test.com/', '--headless', ...] +self = +args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1682142296_6605854_93796.py', '--host', 'https://test.com/', '--headless', ...] 
executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -7314,7 +7265,7 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = +self = args = ['locust', '--help'], executable = 'locust', preexec_fn = None close_fds = True, pass_fds = (), cwd = None, env = None universal_newlines = None, startupinfo = None, creationflags = 0, shell = False @@ -7608,8 +7559,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1716551703_20291_1032.py', '--host', 'https://test.com/', '--run-time', ...] +self = +args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1682142297_1273649_32460.py', '--host', 'https://test.com/', '--run-time', ...] 
executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -7926,7 +7877,7 @@ output = proc.communicate()[0] stdin.close() > self.assertIn("Ramping to 1 users at a rate of 100.00 per second", output) -E AssertionError: 'Ramping to 1 users at a rate of 100.00 per second' not found in '/build/locust-2.12.1/debian/locust: 2: from: not found\n/build/locust-2.12.1/debian/locust: 5: Syntax error: end of file unexpected\n' +E AssertionError: 'Ramping to 1 users at a rate of 100.00 per second' not found in '/build/locust-2.12.1/debian/locust: line 2: from: command not found\n/build/locust-2.12.1/debian/locust: line 5: syntax error: unexpected end of file\n' locust/test/test_main.py:778: AssertionError __ StandaloneIntegrationTests.test_no_error_when_same_userclass_in_two_files ___ @@ -7952,8 +7903,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/tmp/tmprz81sf86_locustfile.py,/tmp/tmpkfxi6f9__locustfile.py', '-t', '1', '--headless'] +self = +args = ['locust', '-f', '/tmp/tmphdvc48_f_locustfile.py,/tmp/tmp4c71nlpy_locustfile.py', '-t', '1', '--headless'] executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -8213,7 +8164,7 @@ /usr/lib/python3/dist-packages/gevent/subprocess.py:1866: OSError ----------------------------- Captured stdout call ----------------------------- -from tmprz81sf86_locustfile import TestUser1 +from tmphdvc48_f_locustfile import TestUser1 ___ StandaloneIntegrationTests.test_run_autostart_with_multiple_locustfiles ____ @@ -8257,8 +8208,8 @@ self._execute_child(args, executable, preexec_fn, 
close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/tmp/tmp1zh74hq7', '--autostart', '-u', '2', ...] +self = +args = ['locust', '-f', '/tmp/tmp_nne_8gu', '--autostart', '-u', '2', ...] executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -8558,8 +8509,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/tmp/tmpi1lrxvdg', '--headless', '-u', '2', ...] +self = +args = ['locust', '-f', '/tmp/tmpg36fs40d', '--headless', '-u', '2', ...] executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -8875,8 +8826,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1716551710_1560144_38304.py,/tmp/tmp0yr7sa_z_locustfile.py', '--host', 'https://test.com/', '--headless', ...] +self = +args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1682142304_2120142_22714.py,/tmp/tmpk7uf7uq6_locustfile.py', '--host', 'https://test.com/', '--headless', ...] 
executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -9154,8 +9105,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/tmp/tmp_gyhsou2_locustfile.py,/tmp/tmp2e38278q_locustfile.py', '--class-picker'] +self = +args = ['locust', '-f', '/tmp/tmp1h15jlrn_locustfile.py,/tmp/tmpi9rteffc_locustfile.py', '--class-picker'] executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -9445,7 +9396,7 @@ output = proc.communicate()[0] > self.assertIn("Ramping to 10 users at a rate of 10.00 per second", output) -E AssertionError: 'Ramping to 10 users at a rate of 10.00 per second' not found in '/build/locust-2.12.1/debian/locust: 2: from: not found\n/build/locust-2.12.1/debian/locust: 5: Syntax error: end of file unexpected\n' +E AssertionError: 'Ramping to 10 users at a rate of 10.00 per second' not found in '/build/locust-2.12.1/debian/locust: line 2: from: command not found\n/build/locust-2.12.1/debian/locust: line 5: syntax error: unexpected end of file\n' locust/test/test_main.py:878: AssertionError _____________ StandaloneIntegrationTests.test_spawning_with_fixed ______________ @@ -9504,7 +9455,7 @@ output = proc.communicate()[0] > self.assertIn("Ramping to 10 users at a rate of 10.00 per second", output) -E AssertionError: 'Ramping to 10 users at a rate of 10.00 per second' not found in '/build/locust-2.12.1/debian/locust: 2: from: not found\n/build/locust-2.12.1/debian/locust: 5: Syntax error: end of file unexpected\n' +E AssertionError: 'Ramping to 10 users at a rate of 10.00 per second' not found in 
'/build/locust-2.12.1/debian/locust: line 2: from: command not found\n/build/locust-2.12.1/debian/locust: line 5: syntax error: unexpected end of file\n' locust/test/test_main.py:843: AssertionError _________________ StandaloneIntegrationTests.test_web_options __________________ @@ -9531,8 +9482,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1716551710_9397626_96974.py', '--web-host', '127.0.0.2', '--web-port', ...] +self = +args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1682142305_1453645_68974.py', '--web-host', '127.0.0.2', '--web-port', ...] executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -9815,8 +9766,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/tmp/tmpan69k5fm_locustfile.py'], executable = 'locust' +self = +args = ['locust', '-f', '/tmp/tmp8aocyfsr_locustfile.py'], executable = 'locust' preexec_fn = None, close_fds = True, pass_fds = (), cwd = None, env = None universal_newlines = None, startupinfo = None, creationflags = 0, shell = False p2cread = -1, p2cwrite = -1, c2pread = 11, c2pwrite = 79, errread = 80 @@ -10090,8 +10041,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1716551711_4052985_86925.py,/build/locust-2.12.1/locust/test/mock_locustfile_1716551711_4057608_18293.py'] +self = +args = ['locust', '-f', 
'/build/locust-2.12.1/locust/test/mock_locustfile_1682142305_6320658_36866.py,/build/locust-2.12.1/locust/test/mock_locustfile_1682142305_6323533_40973.py'] executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -10365,8 +10316,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/tmp/tmp81rtjx7p'], executable = 'locust' +self = +args = ['locust', '-f', '/tmp/tmph7yo8c5k'], executable = 'locust' preexec_fn = None, close_fds = True, pass_fds = (), cwd = None, env = None universal_newlines = None, startupinfo = None, creationflags = 0, shell = False p2cread = -1, p2cwrite = -1, c2pread = 11, c2pwrite = 79, errread = 80 @@ -10669,8 +10620,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1716551711_7671833_35428.py,/tmp/tmplhb54_h5_locustfile.py'] +self = +args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1682142306_035903_92274.py,/tmp/tmpp_hlc4cz_locustfile.py'] executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -10971,8 +10922,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1716551711_9707363_76743.py', '--headless', '--master', '--expect-workers', ...] 
+self = +args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1682142306_2318761_38467.py', '--headless', '--master', '--expect-workers', ...] executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -11283,8 +11234,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1716551712_3176918_29574.py', '--headless', '--master', '--expect-workers', ...] +self = +args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1682142306_545112_95610.py', '--headless', '--master', '--expect-workers', ...] executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -11587,8 +11538,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1716551712_51253_7916.py', '--headless', '--master', '--expect-workers', ...] +self = +args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1682142306_7755034_76594.py', '--headless', '--master', '--expect-workers', ...] 
executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -11898,8 +11849,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1716551712_7041698_97935.py', '--headless', '--master', '--expect-workers', ...] +self = +args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1682142307_0649333_93727.py', '--headless', '--master', '--expect-workers', ...] executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -12186,8 +12137,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1716551712_9664903_73428.py', '--headless', '--master', '--expect-workers', ...] +self = +args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1682142307_245911_48614.py', '--headless', '--master', '--expect-workers', ...] 
executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -12465,7 +12416,7 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = +self = args = ['locust', '--help'], executable = 'locust', preexec_fn = None close_fds = True, pass_fds = (), cwd = None, env = None universal_newlines = None, startupinfo = None, creationflags = 0, shell = False @@ -12766,8 +12717,8 @@ self._execute_child(args, executable, preexec_fn, close_fds, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = -args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1716551713_5529037_40905.py', '--headless', '--master', '--expect-workers', ...] +self = +args = ['locust', '-f', '/build/locust-2.12.1/locust/test/mock_locustfile_1682142307_803815_45866.py', '--headless', '--master', '--expect-workers', ...] executable = 'locust', preexec_fn = None, close_fds = True, pass_fds = () cwd = None, env = None, universal_newlines = None, startupinfo = None creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 11 @@ -13096,7 +13047,7 @@ runner.spawning_greenlet.join() delta = time.time() - ts > self.assertTrue(0 <= delta <= 1.05, f"Expected user count to decrease to 2 in 1s, instead it took {delta:f}") -E AssertionError: False is not true : Expected user count to decrease to 2 in 1s, instead it took 1.076156 +E AssertionError: False is not true : Expected user count to decrease to 2 in 1s, instead it took 1.056009 locust/test/test_runners.py:482: AssertionError ------------------------------ Captured log call ------------------------------- @@ -13189,10 +13140,6 @@ WARNING root:runners.py:324 CPU usage above 90%! 
This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! 
See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines _________________ TestMasterRunner.test_custom_shape_scale_up __________________ self = @@ -13310,14 +13257,14 @@ return self._worker_nodes[k] ~~~~~~~~~~~~~~~~~~^^^ KeyError: 'fake_client1' -2024-05-24T11:56:00Z >> failed with KeyError +2023-04-22T05:45:54Z >> failed with KeyError ------------------------------ Captured log call ------------------------------- ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (fake_client1). That's not going to work. ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (fake_client2). That's not going to work. ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (fake_client3). That's not going to work. WARNING locust.runners:runners.py:751 You can't start a distributed test before at least one worker processes has connected -CRITICAL locust.runners:log.py:74 Unhandled exception in greenlet: >> +CRITICAL locust.runners:log.py:74 Unhandled exception in greenlet: >> Traceback (most recent call last): File "src/gevent/greenlet.py", line 908, in gevent._gevent_cgreenlet.Greenlet.run File "/build/locust-2.12.1/locust/runners.py", line 1088, in client_listener @@ -13327,14 +13274,6 @@ return self._worker_nodes[k] ^^^^^^^^^^^^^^^^^ KeyError: 'fake_client1' -INFO locust.runners:runners.py:372 Shape test updating to 1 users at 5.00 spawn rate -WARNING locust.runners:runners.py:751 You can't start a distributed test before at least one worker processes has connected -INFO locust.runners:runners.py:957 Worker 0 failed to send heartbeat, setting state to missing. -INFO locust.runners:runners.py:957 Worker 1 failed to send heartbeat, setting state to missing. -INFO locust.runners:runners.py:957 Worker 2 failed to send heartbeat, setting state to missing. 
-INFO locust.runners:runners.py:957 Worker 3 failed to send heartbeat, setting state to missing. -INFO locust.runners:runners.py:957 Worker 4 failed to send heartbeat, setting state to missing. -INFO locust.runners:runners.py:965 The last worker went missing, stopping test. ____________ TestMasterRunner.test_last_worker_quitting_stops_test _____________ self = @@ -13372,13 +13311,13 @@ return self._worker_nodes[k] ~~~~~~~~~~~~~~~~~~^^^ KeyError: 'fake_client1' -2024-05-24T11:56:00Z >> failed with KeyError +2023-04-22T05:45:55Z >> failed with KeyError ------------------------------ Captured log call ------------------------------- ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (fake_client1). That's not going to work. ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (fake_client2). That's not going to work. WARNING locust.runners:runners.py:751 You can't start a distributed test before at least one worker processes has connected -CRITICAL locust.runners:log.py:74 Unhandled exception in greenlet: >> +CRITICAL locust.runners:log.py:74 Unhandled exception in greenlet: >> Traceback (most recent call last): File "src/gevent/greenlet.py", line 908, in gevent._gevent_cgreenlet.Greenlet.run File "/build/locust-2.12.1/locust/runners.py", line 1088, in client_listener @@ -13423,7 +13362,7 @@ locust/test/test_runners.py:2093: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = , k = 'fake_client' +self = , k = 'fake_client' def __getitem__(self, k: str) -> WorkerNode: > return self._worker_nodes[k] @@ -13432,9 +13371,14 @@ locust/runners.py:644: KeyError ------------------------------ Captured log call ------------------------------- ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (fake_client). That's not going to work. -WARNING root:runners.py:324 CPU usage above 90%! 
This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines INFO locust.runners:runners.py:356 Shape test stopping INFO locust.runners:runners.py:356 Shape test stopping +INFO locust.runners:runners.py:957 Worker 0 failed to send heartbeat, setting state to missing. +INFO locust.runners:runners.py:957 Worker 1 failed to send heartbeat, setting state to missing. +INFO locust.runners:runners.py:957 Worker 2 failed to send heartbeat, setting state to missing. +INFO locust.runners:runners.py:957 Worker 3 failed to send heartbeat, setting state to missing. +INFO locust.runners:runners.py:957 Worker 4 failed to send heartbeat, setting state to missing. +INFO locust.runners:runners.py:965 The last worker went missing, stopping test. ________________ TestMasterRunner.test_master_reset_connection _________________ self = @@ -13456,8 +13400,6 @@ locust/test/test_runners.py:2868: AssertionError ------------------------------ Captured log call ------------------------------- ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (fake_client). That's not going to work. -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines INFO locust.runners:runners.py:983 Resetting RPC server and all worker connections. 
________ TestMasterRunner.test_rebalance_locust_users_on_worker_connect ________ @@ -13495,12 +13437,6 @@ locust/test/test_runners.py:2877: AssertionError ------------------------------ Captured log call ------------------------------- ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (fake_client). That's not going to work. -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! 
See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines ______________ TestMasterRunner.test_spawn_correct_worker_indexes ______________ self = @@ -13824,14 +13760,14 @@ return self._worker_nodes[k] ~~~~~~~~~~~~~~~~~~^^^ KeyError: 'fake_client1' -2024-05-24T11:56:16Z >> failed with KeyError +2023-04-22T05:46:11Z >> failed with KeyError ------------------------------ Captured log call ------------------------------- ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (fake_client1). That's not going to work. ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (fake_client2). That's not going to work. ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (fake_client3). That's not going to work. WARNING locust.runners:runners.py:751 You can't start a distributed test before at least one worker processes has connected -CRITICAL locust.runners:log.py:74 Unhandled exception in greenlet: >> +CRITICAL locust.runners:log.py:74 Unhandled exception in greenlet: >> Traceback (most recent call last): File "src/gevent/greenlet.py", line 908, in gevent._gevent_cgreenlet.Greenlet.run File "/build/locust-2.12.1/locust/runners.py", line 1088, in client_listener @@ -13898,39 +13834,44 @@ INFO locust.runners:runners.py:507 Ramping to 1 users at a rate of 1.00 per second INFO locust.runners:runners.py:545 All users spawned: {"MyTestUser": 1} (1 total users) INFO locust.runners:runners.py:159 Resetting stats -______________________ TestStopTimeout.test_stop_timeout _______________________ +______________ TestStopTimeout.test_stop_timeout_during_on_start _______________ -self = +self = - def test_stop_timeout(self): + def test_stop_timeout_during_on_start(self): short_time = 0.05 class MyTaskSet(TaskSet): + finished_on_start = False + my_task_run = False + + def on_start(self): + gevent.sleep(short_time) + MyTaskSet.finished_on_start = True 
+ @task def my_task(self): - MyTaskSet.state = "first" - gevent.sleep(short_time) - MyTaskSet.state = "second" # should only run when run time + stop_timeout is > short_time - gevent.sleep(short_time) - MyTaskSet.state = "third" # should only run when run time + stop_timeout is > short_time * 2 + MyTaskSet.my_task_run = True class MyTestUser(User): tasks = [MyTaskSet] - environment = Environment(user_classes=[MyTestUser]) + environment = create_environment([MyTestUser], mocked_options()) + environment.stop_timeout = short_time runner = environment.create_local_runner() - runner.start(1, 1, wait=False) + runner.start(1, 1) gevent.sleep(short_time / 2) runner.quit() -> self.assertEqual("first", MyTaskSet.state) -E AssertionError: 'first' != 'second' -E - first -E + second + + self.assertTrue(MyTaskSet.finished_on_start) +> self.assertFalse(MyTaskSet.my_task_run) +E AssertionError: True is not false -locust/test/test_runners.py:3770: AssertionError +locust/test/test_runners.py:3822: AssertionError ------------------------------ Captured log call ------------------------------- INFO locust.runners:runners.py:507 Ramping to 1 users at a rate of 1.00 per second INFO locust.runners:runners.py:545 All users spawned: {"MyTestUser": 1} (1 total users) +INFO locust.runners:runners.py:159 Resetting stats _______________ TestStopTimeout.test_stop_timeout_with_ramp_down _______________ self = @@ -13964,52 +13905,45 @@ runner.spawning_greenlet.join() delta = time.perf_counter() - ts > self.assertTrue(2 <= delta <= 2.05, f"Expected user count to decrease to 2 in 2s, instead it took {delta:f}") -E AssertionError: False is not true : Expected user count to decrease to 2 in 2s, instead it took 2.082472 +E AssertionError: False is not true : Expected user count to decrease to 2 in 2s, instead it took 2.079609 locust/test/test_runners.py:4033: AssertionError ------------------------------ Captured log call ------------------------------- INFO locust.runners:runners.py:507 Ramping to 10 
users at a rate of 10.00 per second INFO locust.runners:runners.py:545 All users spawned: {"MyTestUser": 10} (10 total users) INFO locust.runners:runners.py:507 Ramping to 2 users at a rate of 4.00 per second +WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines [identical +WARNING lines elided] INFO locust.runners:runners.py:545 All users spawned: {"MyTestUser": 2} (2 total users) -_______________ TestCsvStats.test_csv_stats_writer_full_history ________________ - -self = - - @mock.patch("locust.stats.CSV_STATS_INTERVAL_SEC", new=_TEST_CSV_STATS_INTERVAL_SEC) - def test_csv_stats_writer_full_history(self): - stats_writer = StatsCSVFileWriter( - self.environment, PERCENTILES_TO_REPORT, self.STATS_BASE_NAME, full_history=True - ) - - for i in range(10): - self.runner.stats.log_request("GET", "/", 100, content_length=666) - - greenlet = gevent.spawn(stats_writer) - gevent.sleep(10) - - for i in range(10): - self.runner.stats.log_request("GET", "/", 10, content_length=666) - - gevent.sleep(5) - - gevent.sleep(_TEST_CSV_STATS_INTERVAL_WAIT_SEC) - gevent.kill(greenlet) - stats_writer.close_files() - - self.assertTrue(os.path.exists(self.STATS_FILENAME)) - self.assertTrue(os.path.exists(self.STATS_HISTORY_FILENAME)) - self.assertTrue(os.path.exists(self.STATS_FAILURES_FILENAME)) - 
self.assertTrue(os.path.exists(self.STATS_EXCEPTIONS_FILENAME)) - - with open(self.STATS_HISTORY_FILENAME) as f: - reader = csv.DictReader(f) - rows = [r for r in reader] - -> self.assertGreaterEqual(len(rows), 130) -E AssertionError: 126 not greater than or equal to 130 - -locust/test/test_stats.py:461: AssertionError _______________________ TestCsvStats.test_stats_history ________________________ self = @@ -14150,7 +14084,7 @@ self.connect_to_master() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -self = +self = def connect_to_master(self): self.retry += 1 @@ -14171,228 +14105,130 @@ locust/runners.py:1396: ConnectionError ------------------------------ Captured log call ------------------------------- -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines [identical -ERROR and -WARNING lines elided] +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. 
+ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. +WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 3/60. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! 
See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 3/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 4/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 5/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 6/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 7/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. 
+ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 8/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 9/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! 
This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! 
This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! 
This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 10/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). 
That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 11/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 12/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 13/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 14/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 15/60. 
-ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 16/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 17/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 18/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 19/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. 
WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 20/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 21/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 22/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 23/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 24/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. 
+ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 25/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines -WARNING root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 26/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 27/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. 
WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 28/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 29/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 30/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 31/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 32/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. 
+ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 33/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 34/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 35/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 36/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 37/60. 
-ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 38/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 39/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 40/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 41/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. 
WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 42/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 43/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 44/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 45/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 46/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. 
+ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 47/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 48/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 49/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 50/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 51/60. 
-ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 52/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 53/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 54/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 55/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. 
WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 56/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 57/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 58/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 59/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. +ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work. WARNING locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 60/60. -ERROR locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (ionos15-amd64_1f072316f927448e954329c59323de95). That's not going to work. 
+ERROR    locust.runners:runners.py:1022 An old (pre 2.0) worker tried to connect (i-capture-the-hostname_a8960928428349b685943050045e103f). That's not going to work.
 WARNING  locust.runners:runners.py:1392 Failed to connect to master 127.0.0.1:5558, retry 61/60.
 ______________ TestCsvStats.test_user_count_in_csv_history_stats _______________
@@ -14431,12 +14267,40 @@
         rows = [r for r in reader]
 >       self.assertEqual(2 * user_count, len(rows))
-E       AssertionError: 30 != 26
+E       AssertionError: 30 != 28
 
 locust/test/test_stats.py:547: AssertionError
 ------------------------------ Captured log call -------------------------------
 INFO     locust.runners:runners.py:507 Ramping to 15 users at a rate of 5.00 per second
 INFO     locust.runners:runners.py:545 All users spawned: {"TestUser": 15} (15 total users)
+____________________ TestWaitTime.test_constant_throughput _____________________
+
+self = 
+
+    def test_constant_throughput(self):
+        class MyUser(User):
+            wait_time = constant_throughput(10)
+
+        class TS(TaskSet):
+            pass
+
+        ts = TS(MyUser(self.environment))
+
+        ts2 = TS(MyUser(self.environment))
+
+        previous_time = time.perf_counter()
+        for i in range(7):
+            ts.wait()
+            since_last_run = time.perf_counter() - previous_time
+>           self.assertLess(abs(0.1 - since_last_run), 0.02)
+E           AssertionError: 0.02725469900760799 not less than 0.02
+
+locust/test/test_wait_time.py:75: AssertionError
+------------------------------ Captured log call -------------------------------
+WARNING  root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines
+WARNING  root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines
+WARNING  root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines
+WARNING  root:runners.py:324 CPU usage above 90%! This may constrain your throughput and may even give inconsistent response time measurements! See https://docs.locust.io/en/stable/running-distributed.html for how to distribute the load over multiple CPU cores or machines
 =============================== warnings summary ===============================
 locust/test/test_fasthttp.py::TestFastHttpSsl::test_ssl_request_insecure
 locust/test/test_web.py::TestWebUIWithTLS::test_index_with_https
@@ -14452,9 +14316,6 @@
 -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
 =========================== short test summary info ============================
-FAILED locust/test/test_dispatch.py::TestLargeScale::test_distribute_users - ...
-FAILED locust/test/test_dispatch.py::TestLargeScale::test_ramp_up_from_0_to_100_000_users_with_50_user_classes_and_1000_workers_and_5000_spawn_rate
-FAILED locust/test/test_fasthttp.py::TestFastHttpUserClass::test_client_pool_concurrency
 FAILED locust/test/test_log.py::TestLoggingOptions::test_log_to_file - OSErro...
 FAILED locust/test/test_log.py::TestLoggingOptions::test_logging_output - OSE...
 FAILED locust/test/test_log.py::TestLoggingOptions::test_skip_logging - OSErr...
@@ -14519,12 +14380,12 @@
 FAILED locust/test/test_runners.py::TestMasterRunner::test_worker_missing_after_heartbeat_dead_interval
 FAILED locust/test/test_runners.py::TestMasterRunner::test_worker_sends_bad_message_to_master
 FAILED locust/test/test_runners.py::TestStopTimeout::test_kill_locusts_with_stop_timeout
-FAILED locust/test/test_runners.py::TestStopTimeout::test_stop_timeout - Asse...
+FAILED locust/test/test_runners.py::TestStopTimeout::test_stop_timeout_during_on_start
 FAILED locust/test/test_runners.py::TestStopTimeout::test_stop_timeout_with_ramp_down
-FAILED locust/test/test_stats.py::TestCsvStats::test_csv_stats_writer_full_history
 FAILED locust/test/test_stats.py::TestCsvStats::test_stats_history - Connecti...
 FAILED locust/test/test_stats.py::TestCsvStats::test_user_count_in_csv_history_stats
-==== 72 failed, 430 passed, 15 deselected, 7 warnings in 710.42s (0:11:50) =====
+FAILED locust/test/test_wait_time.py::TestWaitTime::test_constant_throughput
+==== 69 failed, 433 passed, 15 deselected, 7 warnings in 633.86s (0:10:33) =====
 E: pybuild pybuild:388: test: plugin custom failed with: exit code=1: PYTHONPATH=/build/locust-2.12.1/.pybuild/cpython3_3.11_locust/build python3.11 -m pytest -v --ignore=examples/test_data_management.py -k 'not TestMasterWorkerRunners'
 dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p 3.11 returned exit code 13
 make[1]: [debian/rules:15: override_dh_auto_test] Error 25 (ignored)
@@ -14565,12 +14426,14 @@
 dpkg-buildpackage: info: binary-only upload (no source included)
 dpkg-genchanges: info: including full source code in upload
 I: copying local configuration
+I: user script /srv/workspace/pbuilder/3216951/tmp/hooks/B01_cleanup starting
+I: user script /srv/workspace/pbuilder/3216951/tmp/hooks/B01_cleanup finished
 I: unmounting dev/ptmx filesystem
 I: unmounting dev/pts filesystem
 I: unmounting dev/shm filesystem
 I: unmounting proc filesystem
 I: unmounting sys filesystem
 I: cleaning the build env
-I: removing directory /srv/workspace/pbuilder/178780 and its subdirectories
-I: Current time: Fri May 24 00:03:52 -12 2024
-I: pbuilder-time-stamp: 1716552232
+I: removing directory /srv/workspace/pbuilder/3216951 and its subdirectories
+I: Current time: Sat Apr 22 19:53:02 +14 2023
+I: pbuilder-time-stamp: 1682142782