Wed Apr 12 12:24:59 UTC 2023 I: starting to build lua-cjson/bookworm/arm64 on jenkins on '2023-04-12 12:24'
Wed Apr 12 12:24:59 UTC 2023 I: The jenkins build log is/was available at https://jenkins.debian.net/userContent/reproducible/debian/build_service/arm64_12/6093/console.log
Wed Apr 12 12:25:00 UTC 2023 I: Downloading source for bookworm/lua-cjson=2.1.0+dfsg-2.2
--2023-04-12 12:25:00--  http://cdn-fastly.deb.debian.org/debian/pool/main/l/lua-cjson/lua-cjson_2.1.0%2bdfsg-2.2.dsc
Connecting to 78.137.99.97:3128... connected.
Proxy request sent, awaiting response... 200 OK
Length: 2045 (2.0K) [text/prs.lines.tag]
Saving to: ‘lua-cjson_2.1.0+dfsg-2.2.dsc’

     0K .                                                     100% 57.2M=0s

2023-04-12 12:25:00 (57.2 MB/s) - ‘lua-cjson_2.1.0+dfsg-2.2.dsc’ saved [2045/2045]

Wed Apr 12 12:25:00 UTC 2023 I: lua-cjson_2.1.0+dfsg-2.2.dsc
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA512

Format: 3.0 (quilt)
Source: lua-cjson
Binary: lua-cjson, lua-cjson-dev
Architecture: any
Version: 2.1.0+dfsg-2.2
Maintainer: The Debian Lua Team
Uploaders: Dmitry E. Oboukhov
Homepage: http://www.kyne.com.au/~mark/software/lua-cjson.php
Standards-Version: 4.6.1
Vcs-Browser: https://salsa.debian.org/lua-team/lua-cjson
Vcs-Git: https://salsa.debian.org/lua-team/lua-cjson.git
Build-Depends: debhelper-compat (= 13), dh-lua
Package-List:
 lua-cjson deb interpreters optional arch=any
 lua-cjson-dev deb libdevel optional arch=any
Checksums-Sha1:
 90bce831a3207ff0776f891e205e4d5628b6e059 76830 lua-cjson_2.1.0+dfsg.orig.tar.gz
 7186aa18a68708c7988a8158d4d246c66032f70f 4192 lua-cjson_2.1.0+dfsg-2.2.debian.tar.xz
Checksums-Sha256:
 94f1fce36742cf00e99c5071b309b9f71e6e7daf60a7151f7bf4ac0a97259ac4 76830 lua-cjson_2.1.0+dfsg.orig.tar.gz
 2b6fb37c00e0d1faf671b1e68f80478a7048fb49e356748a48e54996f12377cc 4192 lua-cjson_2.1.0+dfsg-2.2.debian.tar.xz
Files:
 a3394f2eb1670ab3553e3f91eb748b2a 76830 lua-cjson_2.1.0+dfsg.orig.tar.gz
 623c4ba369d313102b7a81a1f76ae1ca 4192 lua-cjson_2.1.0+dfsg-2.2.debian.tar.xz

-----BEGIN PGP SIGNATURE-----

iQIzBAEBCgAdFiEEfncpR22H1vEdkazLwpPntGGCWs4FAmOI4+YACgkQwpPntGGC
Ws4NrQ/8CR2idWMiR8JGklPWNkPd1amYBkBxQdUe1rEkwzt2DkJljKmU5wcDkOOd
50RafnHtXOYHylZ5LU0zgtR9bB507kkZfG9U2+doSyoWyeuQ1TiyI1FSPhmtNIWC
C37geMNMuSt2vXlDCxPT+JgZtIlywcHhZDljumPzsg1mx5oPcAJ2bbHIlHz4HibZ
nFjRoxFkLVD5wGCKtEAt7FKoRmguIY8690PXeTQ6ONCTYPAwYVuMNmkx7mmtcPDi
YM6BHbhrUd5JhEtpjLZd3LfHmtc42eZbP/WPxvn+QEcgdZUdeZq+i1HlZjX9zoZ/
KZ3BPVHdzaA1Sh/qEwgXb3D2Cjcu6iqlGdrq8GCV4mO3nTCVJzkrtb/JZU6smauj
EmIgxaBNLU9oZH/Xgu3Xj6raC10H+TYTtH/wIgohYUay5w66p6K1UWCaMsXnrp2H
C0j3MgOMMKm5WnZgShyOJk68Z/g8q6+CkjNC+oLXXoEgpq8+2FyFGopf05LdKxW3
5gocrzn3XZfzFUlwLckyX3CAGk7C3K8sn4KFfSeM87gXbgEpRIdRqJpELCW3FhU5
YLERdAjFKbvMdMcFHPCEEdbIRTq8SILUBOM+azJcMOYsqo+lXc095jhJldoJXcjL
vQ17ED40dO3C5ASU/l+KbDgEjDY2dURSjvoX9LkqtvJOxZZBOcw=
=W3zn
-----END PGP SIGNATURE-----

Wed Apr 12 12:25:00 UTC 2023 I: Checking whether the package is not for us
Wed Apr 12 12:25:00 UTC 2023 I: Starting 1st build on remote node codethink11-arm64.debian.net.
Wed Apr 12 12:25:00 UTC 2023 I: Preparing to do remote build '1' on codethink11-arm64.debian.net.
Wed Apr 12 12:27:24 UTC 2023 I: Deleting $TMPDIR on codethink11-arm64.debian.net.
I: pbuilder: network access will be disabled during build
I: Current time: Tue May 14 06:48:02 -12 2024
I: pbuilder-time-stamp: 1715712482
I: Building the build Environment
I: extracting base tarball [/var/cache/pbuilder/bookworm-reproducible-base.tgz]
I: copying local configuration
W: --override-config is not set; not updating apt.conf Read the manpage for details.
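The .dsc above declares SHA-256 digests for the two source artifacts. A minimal illustrative re-check in Python, with the file names and digests copied from the Checksums-Sha256 field above (the script itself is a sketch and is not part of this build):

    #!/usr/bin/env python3
    """Sketch: re-check the source artifacts against the SHA-256 values
    declared in lua-cjson_2.1.0+dfsg-2.2.dsc (illustrative, not part of the build)."""
    import hashlib
    import sys

    # Digests copied from the Checksums-Sha256 field of the .dsc shown above.
    EXPECTED = {
        "lua-cjson_2.1.0+dfsg.orig.tar.gz":
            "94f1fce36742cf00e99c5071b309b9f71e6e7daf60a7151f7bf4ac0a97259ac4",
        "lua-cjson_2.1.0+dfsg-2.2.debian.tar.xz":
            "2b6fb37c00e0d1faf671b1e68f80478a7048fb49e356748a48e54996f12377cc",
    }

    def sha256_of(path):
        """Hash a file in chunks so large tarballs are not read into memory at once."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    ok = True
    for name, want in EXPECTED.items():
        got = sha256_of(name)
        match = got == want
        ok = ok and match
        print(f"{name}: {'OK' if match else 'MISMATCH'}")
    sys.exit(0 if ok else 1)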
I: mounting /proc filesystem
I: mounting /sys filesystem
I: creating /{dev,run}/shm
I: mounting /dev/pts filesystem
I: redirecting /dev/ptmx to /dev/pts/ptmx
I: policy-rc.d already exists
I: Copying source file
I: copying [lua-cjson_2.1.0+dfsg-2.2.dsc]
I: copying [./lua-cjson_2.1.0+dfsg.orig.tar.gz]
I: copying [./lua-cjson_2.1.0+dfsg-2.2.debian.tar.xz]
I: Extracting source
gpgv: Signature made Thu Dec 1 05:27:02 2022 -12
gpgv: using RSA key 7E7729476D87D6F11D91ACCBC293E7B461825ACE
gpgv: Can't check signature: No public key
dpkg-source: warning: cannot verify inline signature for ./lua-cjson_2.1.0+dfsg-2.2.dsc: no acceptable signature found
dpkg-source: info: extracting lua-cjson in lua-cjson-2.1.0+dfsg
dpkg-source: info: unpacking lua-cjson_2.1.0+dfsg.orig.tar.gz
dpkg-source: info: unpacking lua-cjson_2.1.0+dfsg-2.2.debian.tar.xz
dpkg-source: info: using patch list from debian/patches/series
dpkg-source: info: applying disable-sparse-array-test
dpkg-source: info: applying disable-utf16-test
dpkg-source: info: applying lua5.2-function-names
dpkg-source: info: applying lua5.3-unpack
I: Not using root during the build.
I: Installing the build-deps
I: user script /srv/workspace/pbuilder/20159/tmp/hooks/D02_print_environment starting
I: set
 BUILDDIR='/build'
 BUILDUSERGECOS='first user,first room,first work-phone,first home-phone,first other'
 BUILDUSERNAME='pbuilder1'
 BUILD_ARCH='arm64'
 DEBIAN_FRONTEND='noninteractive'
 DEB_BUILD_OPTIONS='buildinfo=+all reproducible=+all parallel=8'
 DISTRIBUTION='bookworm'
 HOME='/var/lib/jenkins'
 HOST_ARCH='arm64'
 IFS=' '
 LANG='C'
 LANGUAGE='en_US:en'
 LC_ALL='C'
 MAIL='/var/mail/root'
 OPTIND='1'
 PATH='/usr/sbin:/usr/bin:/sbin:/bin:/usr/games'
 PBCURRENTCOMMANDLINEOPERATION='build'
 PBUILDER_OPERATION='build'
 PBUILDER_PKGDATADIR='/usr/share/pbuilder'
 PBUILDER_PKGLIBDIR='/usr/lib/pbuilder'
 PBUILDER_SYSCONFDIR='/etc'
 PPID='20159'
 PS1='# '
 PS2='> '
 PS4='+ '
 PWD='/'
 SHELL='/bin/bash'
 SHLVL='2'
 SUDO_COMMAND='/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.kKq3SN5t/pbuilderrc_awf7 --distribution bookworm --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/bookworm-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.kKq3SN5t/b1 --logfile b1/build.log lua-cjson_2.1.0+dfsg-2.2.dsc'
 SUDO_GID='117'
 SUDO_UID='110'
 SUDO_USER='jenkins'
 TERM='unknown'
 TZ='/usr/share/zoneinfo/Etc/GMT+12'
 USER='root'
 USERNAME='root'
 _='/usr/bin/systemd-run'
 http_proxy='http://192.168.101.16:3128'
I: uname -a
Linux codethink11-arm64 4.15.0-208-generic #220-Ubuntu SMP Mon Mar 20 14:28:12 UTC 2023 aarch64 GNU/Linux
I: ls -l /bin
lrwxrwxrwx 1 root root 7 May 12 04:50 /bin -> usr/bin
I: user script /srv/workspace/pbuilder/20159/tmp/hooks/D02_print_environment finished
 -> Attempting to satisfy build-dependencies
 -> Creating pbuilder-satisfydepends-dummy package
Package: pbuilder-satisfydepends-dummy
Version: 0.invalid.0
Architecture: arm64
Maintainer: Debian Pbuilder Team
Description: Dummy package to satisfy dependencies with aptitude - created by pbuilder
 This package was created automatically by pbuilder to satisfy the build-dependencies of the package being currently built.
Depends: debhelper-compat (= 13), dh-lua
dpkg-deb: building package 'pbuilder-satisfydepends-dummy' in '/tmp/satisfydepends-aptitude/pbuilder-satisfydepends-dummy.deb'.
Selecting previously unselected package pbuilder-satisfydepends-dummy.
(Reading database ... 19616 files and directories currently installed.)
Preparing to unpack .../pbuilder-satisfydepends-dummy.deb ...
Unpacking pbuilder-satisfydepends-dummy (0.invalid.0) ...
dpkg: pbuilder-satisfydepends-dummy: dependency problems, but configuring anyway as you requested:
 pbuilder-satisfydepends-dummy depends on debhelper-compat (= 13); however:
  Package debhelper-compat is not installed.
 pbuilder-satisfydepends-dummy depends on dh-lua; however:
  Package dh-lua is not installed.
Setting up pbuilder-satisfydepends-dummy (0.invalid.0) ...
Reading package lists...
Building dependency tree...
Reading state information...
Initializing package states...
Writing extended state information...
Building tag database...
pbuilder-satisfydepends-dummy is already installed at the requested version (0.invalid.0)
pbuilder-satisfydepends-dummy is already installed at the requested version (0.invalid.0)
The following NEW packages will be installed:
  autoconf{a} automake{a} autopoint{a} autotools-dev{a} bsdextrautils{a} dctrl-tools{a} debhelper{a} dh-autoreconf{a} dh-lua{a} dh-strip-nondeterminism{a} dwz{a} file{a}
  gettext{a} gettext-base{a} groff-base{a} intltool-debian{a} libarchive-zip-perl{a} libdebhelper-perl{a} libelf1{a} libfile-stripnondeterminism-perl{a} libicu72{a}
  liblua5.1-0{a} liblua5.1-0-dev{a} liblua5.2-0{a} liblua5.2-dev{a} liblua5.3-0{a} liblua5.3-dev{a} liblua5.4-0{a} liblua5.4-dev{a} libmagic-mgc{a} libmagic1{a}
  libncurses-dev{a} libncurses6{a} libpipeline1{a} libpkgconf3{a} libreadline-dev{a} libreadline8{a} libsub-override-perl{a} libtool{a} libuchardet0{a} libxml2{a}
  lua5.1{a} lua5.2{a} lua5.3{a} lua5.4{a} m4{a} man-db{a} pkg-config{a} pkgconf{a} pkgconf-bin{a} po-debconf{a} readline-common{a} sensible-utils{a}
The following packages are RECOMMENDED but will NOT be installed:
  curl libarchive-cpio-perl libgpm2 libltdl-dev libmail-sendmail-perl libtool-bin lynx wget
0 packages upgraded, 53 newly installed, 0 to remove and 0 not upgraded.
Need to get 20.8 MB of archives. After unpacking 87.5 MB will be used.
Writing extended state information...
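The D02_print_environment hook above records the full build environment precisely so that the first and second reproducible builds can be compared. A small illustrative helper, assuming two such logs are available locally (the script name, function, and file handling are hypothetical, not part of the jenkins tooling):

    #!/usr/bin/env python3
    """Sketch: extract the VAR='value' lines printed between "I: set" and
    "I: uname -a" from a pbuilder log so two build environments can be diffed."""
    import re
    import sys

    def build_environment(log_path):
        # Collect variables only inside the "I: set" block of the log.
        env, capturing = {}, False
        with open(log_path, errors="replace") as log:
            for line in log:
                line = line.rstrip("\n")
                if line.startswith("I: set"):
                    capturing = True
                    continue
                if line.startswith("I: uname -a"):
                    break
                if capturing:
                    m = re.match(r"\s*([A-Za-z_][A-Za-z0-9_]*)='(.*)'$", line)
                    if m:
                        env[m.group(1)] = m.group(2)
        return env

    if __name__ == "__main__":
        if len(sys.argv) != 3:
            sys.exit("usage: diff_build_env.py first-build.log second-build.log")
        first, second = (build_environment(p) for p in sys.argv[1:3])
        for key in sorted(set(first) | set(second)):
            if first.get(key) != second.get(key):
                print(f"{key}: {first.get(key)!r} != {second.get(key)!r}")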
Get: 1 http://deb.debian.org/debian bookworm/main arm64 readline-common all 8.2-1.3 [69.0 kB] Get: 2 http://deb.debian.org/debian bookworm/main arm64 sensible-utils all 0.0.17+nmu1 [19.0 kB] Get: 3 http://deb.debian.org/debian bookworm/main arm64 libmagic-mgc arm64 1:5.44-3 [305 kB] Get: 4 http://deb.debian.org/debian bookworm/main arm64 libmagic1 arm64 1:5.44-3 [98.5 kB] Get: 5 http://deb.debian.org/debian bookworm/main arm64 file arm64 1:5.44-3 [42.5 kB] Get: 6 http://deb.debian.org/debian bookworm/main arm64 gettext-base arm64 0.21-12 [159 kB] Get: 7 http://deb.debian.org/debian bookworm/main arm64 libuchardet0 arm64 0.0.7-1 [67.9 kB] Get: 8 http://deb.debian.org/debian bookworm/main arm64 groff-base arm64 1.22.4-10 [861 kB] Get: 9 http://deb.debian.org/debian bookworm/main arm64 bsdextrautils arm64 2.38.1-5+b1 [86.9 kB] Get: 10 http://deb.debian.org/debian bookworm/main arm64 libpipeline1 arm64 1.5.7-1 [36.4 kB] Get: 11 http://deb.debian.org/debian bookworm/main arm64 man-db arm64 2.11.2-2 [1369 kB] Get: 12 http://deb.debian.org/debian bookworm/main arm64 m4 arm64 1.4.19-3 [276 kB] Get: 13 http://deb.debian.org/debian bookworm/main arm64 autoconf all 2.71-3 [332 kB] Get: 14 http://deb.debian.org/debian bookworm/main arm64 autotools-dev all 20220109.1 [51.6 kB] Get: 15 http://deb.debian.org/debian bookworm/main arm64 automake all 1:1.16.5-1.3 [823 kB] Get: 16 http://deb.debian.org/debian bookworm/main arm64 autopoint all 0.21-12 [495 kB] Get: 17 http://deb.debian.org/debian bookworm/main arm64 dctrl-tools arm64 2.24-3 [101 kB] Get: 18 http://deb.debian.org/debian bookworm/main arm64 libdebhelper-perl all 13.11.4 [81.2 kB] Get: 19 http://deb.debian.org/debian bookworm/main arm64 libtool all 2.4.7-5 [517 kB] Get: 20 http://deb.debian.org/debian bookworm/main arm64 dh-autoreconf all 20 [17.1 kB] Get: 21 http://deb.debian.org/debian bookworm/main arm64 libarchive-zip-perl all 1.68-1 [104 kB] Get: 22 http://deb.debian.org/debian bookworm/main arm64 libsub-override-perl all 0.09-4 [9304 B] Get: 23 http://deb.debian.org/debian bookworm/main arm64 libfile-stripnondeterminism-perl all 1.13.1-1 [19.4 kB] Get: 24 http://deb.debian.org/debian bookworm/main arm64 dh-strip-nondeterminism all 1.13.1-1 [8620 B] Get: 25 http://deb.debian.org/debian bookworm/main arm64 libelf1 arm64 0.188-2.1 [173 kB] Get: 26 http://deb.debian.org/debian bookworm/main arm64 dwz arm64 0.15-1 [101 kB] Get: 27 http://deb.debian.org/debian bookworm/main arm64 libicu72 arm64 72.1-3 [9204 kB] Get: 28 http://deb.debian.org/debian bookworm/main arm64 libxml2 arm64 2.9.14+dfsg-1.1+b3 [619 kB] Get: 29 http://deb.debian.org/debian bookworm/main arm64 gettext arm64 0.21-12 [1248 kB] Get: 30 http://deb.debian.org/debian bookworm/main arm64 intltool-debian all 0.35.0+20060710.6 [22.9 kB] Get: 31 http://deb.debian.org/debian bookworm/main arm64 po-debconf all 1.0.21+nmu1 [248 kB] Get: 32 http://deb.debian.org/debian bookworm/main arm64 debhelper all 13.11.4 [942 kB] Get: 33 http://deb.debian.org/debian bookworm/main arm64 libpkgconf3 arm64 1.8.1-1 [35.3 kB] Get: 34 http://deb.debian.org/debian bookworm/main arm64 pkgconf-bin arm64 1.8.1-1 [28.9 kB] Get: 35 http://deb.debian.org/debian bookworm/main arm64 pkgconf arm64 1.8.1-1 [25.9 kB] Get: 36 http://deb.debian.org/debian bookworm/main arm64 pkg-config arm64 1.8.1-1 [13.7 kB] Get: 37 http://deb.debian.org/debian bookworm/main arm64 liblua5.4-0 arm64 5.4.4-3 [124 kB] Get: 38 http://deb.debian.org/debian bookworm/main arm64 libreadline8 arm64 8.2-1.3 [155 kB] Get: 39 
http://deb.debian.org/debian bookworm/main arm64 libncurses6 arm64 6.4-2 [94.0 kB] Get: 40 http://deb.debian.org/debian bookworm/main arm64 libncurses-dev arm64 6.4-2 [335 kB] Get: 41 http://deb.debian.org/debian bookworm/main arm64 libreadline-dev arm64 8.2-1.3 [151 kB] Get: 42 http://deb.debian.org/debian bookworm/main arm64 liblua5.4-dev arm64 5.4.4-3 [154 kB] Get: 43 http://deb.debian.org/debian bookworm/main arm64 lua5.4 arm64 5.4.4-3 [119 kB] Get: 44 http://deb.debian.org/debian bookworm/main arm64 liblua5.3-0 arm64 5.3.6-2 [110 kB] Get: 45 http://deb.debian.org/debian bookworm/main arm64 liblua5.3-dev arm64 5.3.6-2 [141 kB] Get: 46 http://deb.debian.org/debian bookworm/main arm64 lua5.3 arm64 5.3.6-2 [105 kB] Get: 47 http://deb.debian.org/debian bookworm/main arm64 liblua5.2-0 arm64 5.2.4-3 [102 kB] Get: 48 http://deb.debian.org/debian bookworm/main arm64 liblua5.2-dev arm64 5.2.4-3 [129 kB] Get: 49 http://deb.debian.org/debian bookworm/main arm64 lua5.2 arm64 5.2.4-3 [95.4 kB] Get: 50 http://deb.debian.org/debian bookworm/main arm64 liblua5.1-0 arm64 5.1.5-9 [104 kB] Get: 51 http://deb.debian.org/debian bookworm/main arm64 liblua5.1-0-dev arm64 5.1.5-9 [128 kB] Get: 52 http://deb.debian.org/debian bookworm/main arm64 lua5.1 arm64 5.1.5-9 [98.1 kB] Get: 53 http://deb.debian.org/debian bookworm/main arm64 dh-lua all 29 [30.0 kB] Fetched 20.8 MB in 0s (46.6 MB/s) debconf: delaying package configuration, since apt-utils is not installed Selecting previously unselected package readline-common. (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 19616 files and directories currently installed.) Preparing to unpack .../00-readline-common_8.2-1.3_all.deb ... Unpacking readline-common (8.2-1.3) ... Selecting previously unselected package sensible-utils. Preparing to unpack .../01-sensible-utils_0.0.17+nmu1_all.deb ... Unpacking sensible-utils (0.0.17+nmu1) ... Selecting previously unselected package libmagic-mgc. Preparing to unpack .../02-libmagic-mgc_1%3a5.44-3_arm64.deb ... Unpacking libmagic-mgc (1:5.44-3) ... Selecting previously unselected package libmagic1:arm64. Preparing to unpack .../03-libmagic1_1%3a5.44-3_arm64.deb ... Unpacking libmagic1:arm64 (1:5.44-3) ... Selecting previously unselected package file. Preparing to unpack .../04-file_1%3a5.44-3_arm64.deb ... Unpacking file (1:5.44-3) ... Selecting previously unselected package gettext-base. Preparing to unpack .../05-gettext-base_0.21-12_arm64.deb ... Unpacking gettext-base (0.21-12) ... Selecting previously unselected package libuchardet0:arm64. Preparing to unpack .../06-libuchardet0_0.0.7-1_arm64.deb ... Unpacking libuchardet0:arm64 (0.0.7-1) ... Selecting previously unselected package groff-base. Preparing to unpack .../07-groff-base_1.22.4-10_arm64.deb ... Unpacking groff-base (1.22.4-10) ... Selecting previously unselected package bsdextrautils. Preparing to unpack .../08-bsdextrautils_2.38.1-5+b1_arm64.deb ... Unpacking bsdextrautils (2.38.1-5+b1) ... Selecting previously unselected package libpipeline1:arm64. 
Preparing to unpack .../09-libpipeline1_1.5.7-1_arm64.deb ... Unpacking libpipeline1:arm64 (1.5.7-1) ... Selecting previously unselected package man-db. Preparing to unpack .../10-man-db_2.11.2-2_arm64.deb ... Unpacking man-db (2.11.2-2) ... Selecting previously unselected package m4. Preparing to unpack .../11-m4_1.4.19-3_arm64.deb ... Unpacking m4 (1.4.19-3) ... Selecting previously unselected package autoconf. Preparing to unpack .../12-autoconf_2.71-3_all.deb ... Unpacking autoconf (2.71-3) ... Selecting previously unselected package autotools-dev. Preparing to unpack .../13-autotools-dev_20220109.1_all.deb ... Unpacking autotools-dev (20220109.1) ... Selecting previously unselected package automake. Preparing to unpack .../14-automake_1%3a1.16.5-1.3_all.deb ... Unpacking automake (1:1.16.5-1.3) ... Selecting previously unselected package autopoint. Preparing to unpack .../15-autopoint_0.21-12_all.deb ... Unpacking autopoint (0.21-12) ... Selecting previously unselected package dctrl-tools. Preparing to unpack .../16-dctrl-tools_2.24-3_arm64.deb ... Unpacking dctrl-tools (2.24-3) ... Selecting previously unselected package libdebhelper-perl. Preparing to unpack .../17-libdebhelper-perl_13.11.4_all.deb ... Unpacking libdebhelper-perl (13.11.4) ... Selecting previously unselected package libtool. Preparing to unpack .../18-libtool_2.4.7-5_all.deb ... Unpacking libtool (2.4.7-5) ... Selecting previously unselected package dh-autoreconf. Preparing to unpack .../19-dh-autoreconf_20_all.deb ... Unpacking dh-autoreconf (20) ... Selecting previously unselected package libarchive-zip-perl. Preparing to unpack .../20-libarchive-zip-perl_1.68-1_all.deb ... Unpacking libarchive-zip-perl (1.68-1) ... Selecting previously unselected package libsub-override-perl. Preparing to unpack .../21-libsub-override-perl_0.09-4_all.deb ... Unpacking libsub-override-perl (0.09-4) ... Selecting previously unselected package libfile-stripnondeterminism-perl. Preparing to unpack .../22-libfile-stripnondeterminism-perl_1.13.1-1_all.deb ... Unpacking libfile-stripnondeterminism-perl (1.13.1-1) ... Selecting previously unselected package dh-strip-nondeterminism. Preparing to unpack .../23-dh-strip-nondeterminism_1.13.1-1_all.deb ... Unpacking dh-strip-nondeterminism (1.13.1-1) ... Selecting previously unselected package libelf1:arm64. Preparing to unpack .../24-libelf1_0.188-2.1_arm64.deb ... Unpacking libelf1:arm64 (0.188-2.1) ... Selecting previously unselected package dwz. Preparing to unpack .../25-dwz_0.15-1_arm64.deb ... Unpacking dwz (0.15-1) ... Selecting previously unselected package libicu72:arm64. Preparing to unpack .../26-libicu72_72.1-3_arm64.deb ... Unpacking libicu72:arm64 (72.1-3) ... Selecting previously unselected package libxml2:arm64. Preparing to unpack .../27-libxml2_2.9.14+dfsg-1.1+b3_arm64.deb ... Unpacking libxml2:arm64 (2.9.14+dfsg-1.1+b3) ... Selecting previously unselected package gettext. Preparing to unpack .../28-gettext_0.21-12_arm64.deb ... Unpacking gettext (0.21-12) ... Selecting previously unselected package intltool-debian. Preparing to unpack .../29-intltool-debian_0.35.0+20060710.6_all.deb ... Unpacking intltool-debian (0.35.0+20060710.6) ... Selecting previously unselected package po-debconf. Preparing to unpack .../30-po-debconf_1.0.21+nmu1_all.deb ... Unpacking po-debconf (1.0.21+nmu1) ... Selecting previously unselected package debhelper. Preparing to unpack .../31-debhelper_13.11.4_all.deb ... Unpacking debhelper (13.11.4) ... 
Selecting previously unselected package libpkgconf3:arm64. Preparing to unpack .../32-libpkgconf3_1.8.1-1_arm64.deb ... Unpacking libpkgconf3:arm64 (1.8.1-1) ... Selecting previously unselected package pkgconf-bin. Preparing to unpack .../33-pkgconf-bin_1.8.1-1_arm64.deb ... Unpacking pkgconf-bin (1.8.1-1) ... Selecting previously unselected package pkgconf:arm64. Preparing to unpack .../34-pkgconf_1.8.1-1_arm64.deb ... Unpacking pkgconf:arm64 (1.8.1-1) ... Selecting previously unselected package pkg-config:arm64. Preparing to unpack .../35-pkg-config_1.8.1-1_arm64.deb ... Unpacking pkg-config:arm64 (1.8.1-1) ... Selecting previously unselected package liblua5.4-0:arm64. Preparing to unpack .../36-liblua5.4-0_5.4.4-3_arm64.deb ... Unpacking liblua5.4-0:arm64 (5.4.4-3) ... Selecting previously unselected package libreadline8:arm64. Preparing to unpack .../37-libreadline8_8.2-1.3_arm64.deb ... Unpacking libreadline8:arm64 (8.2-1.3) ... Selecting previously unselected package libncurses6:arm64. Preparing to unpack .../38-libncurses6_6.4-2_arm64.deb ... Unpacking libncurses6:arm64 (6.4-2) ... Selecting previously unselected package libncurses-dev:arm64. Preparing to unpack .../39-libncurses-dev_6.4-2_arm64.deb ... Unpacking libncurses-dev:arm64 (6.4-2) ... Selecting previously unselected package libreadline-dev:arm64. Preparing to unpack .../40-libreadline-dev_8.2-1.3_arm64.deb ... Unpacking libreadline-dev:arm64 (8.2-1.3) ... Selecting previously unselected package liblua5.4-dev:arm64. Preparing to unpack .../41-liblua5.4-dev_5.4.4-3_arm64.deb ... Unpacking liblua5.4-dev:arm64 (5.4.4-3) ... Selecting previously unselected package lua5.4. Preparing to unpack .../42-lua5.4_5.4.4-3_arm64.deb ... Unpacking lua5.4 (5.4.4-3) ... Selecting previously unselected package liblua5.3-0:arm64. Preparing to unpack .../43-liblua5.3-0_5.3.6-2_arm64.deb ... Unpacking liblua5.3-0:arm64 (5.3.6-2) ... Selecting previously unselected package liblua5.3-dev:arm64. Preparing to unpack .../44-liblua5.3-dev_5.3.6-2_arm64.deb ... Unpacking liblua5.3-dev:arm64 (5.3.6-2) ... Selecting previously unselected package lua5.3. Preparing to unpack .../45-lua5.3_5.3.6-2_arm64.deb ... Unpacking lua5.3 (5.3.6-2) ... Selecting previously unselected package liblua5.2-0:arm64. Preparing to unpack .../46-liblua5.2-0_5.2.4-3_arm64.deb ... Unpacking liblua5.2-0:arm64 (5.2.4-3) ... Selecting previously unselected package liblua5.2-dev:arm64. Preparing to unpack .../47-liblua5.2-dev_5.2.4-3_arm64.deb ... Unpacking liblua5.2-dev:arm64 (5.2.4-3) ... Selecting previously unselected package lua5.2. Preparing to unpack .../48-lua5.2_5.2.4-3_arm64.deb ... Unpacking lua5.2 (5.2.4-3) ... Selecting previously unselected package liblua5.1-0:arm64. Preparing to unpack .../49-liblua5.1-0_5.1.5-9_arm64.deb ... Unpacking liblua5.1-0:arm64 (5.1.5-9) ... Selecting previously unselected package liblua5.1-0-dev:arm64. Preparing to unpack .../50-liblua5.1-0-dev_5.1.5-9_arm64.deb ... Unpacking liblua5.1-0-dev:arm64 (5.1.5-9) ... Selecting previously unselected package lua5.1. Preparing to unpack .../51-lua5.1_5.1.5-9_arm64.deb ... Unpacking lua5.1 (5.1.5-9) ... Selecting previously unselected package dh-lua. Preparing to unpack .../52-dh-lua_29_all.deb ... Unpacking dh-lua (29) ... Setting up libpipeline1:arm64 (1.5.7-1) ... Setting up libicu72:arm64 (72.1-3) ... Setting up bsdextrautils (2.38.1-5+b1) ... Setting up libmagic-mgc (1:5.44-3) ... Setting up libarchive-zip-perl (1.68-1) ... Setting up libdebhelper-perl (13.11.4) ... 
Setting up libmagic1:arm64 (1:5.44-3) ...
Setting up gettext-base (0.21-12) ...
Setting up m4 (1.4.19-3) ...
Setting up file (1:5.44-3) ...
Setting up autotools-dev (20220109.1) ...
Setting up libpkgconf3:arm64 (1.8.1-1) ...
Setting up libncurses6:arm64 (6.4-2) ...
Setting up autopoint (0.21-12) ...
Setting up pkgconf-bin (1.8.1-1) ...
Setting up autoconf (2.71-3) ...
Setting up liblua5.2-0:arm64 (5.2.4-3) ...
Setting up sensible-utils (0.0.17+nmu1) ...
Setting up libuchardet0:arm64 (0.0.7-1) ...
Setting up liblua5.3-0:arm64 (5.3.6-2) ...
Setting up liblua5.1-0:arm64 (5.1.5-9) ...
Setting up liblua5.4-0:arm64 (5.4.4-3) ...
Setting up libsub-override-perl (0.09-4) ...
Setting up libelf1:arm64 (0.188-2.1) ...
Setting up readline-common (8.2-1.3) ...
Setting up libxml2:arm64 (2.9.14+dfsg-1.1+b3) ...
Setting up dctrl-tools (2.24-3) ...
Setting up automake (1:1.16.5-1.3) ...
update-alternatives: using /usr/bin/automake-1.16 to provide /usr/bin/automake (automake) in auto mode
Setting up libfile-stripnondeterminism-perl (1.13.1-1) ...
Setting up libncurses-dev:arm64 (6.4-2) ...
Setting up gettext (0.21-12) ...
Setting up libtool (2.4.7-5) ...
Setting up libreadline8:arm64 (8.2-1.3) ...
Setting up lua5.3 (5.3.6-2) ...
update-alternatives: using /usr/bin/lua5.3 to provide /usr/bin/lua (lua-interpreter) in auto mode
update-alternatives: using /usr/bin/luac5.3 to provide /usr/bin/luac (lua-compiler) in auto mode
Setting up libreadline-dev:arm64 (8.2-1.3) ...
Setting up lua5.1 (5.1.5-9) ...
Setting up liblua5.2-dev:arm64 (5.2.4-3) ...
Setting up liblua5.4-dev:arm64 (5.4.4-3) ...
update-alternatives: using /usr/lib/aarch64-linux-gnu/pkgconfig/lua5.4.pc to provide /usr/lib/aarch64-linux-gnu/pkgconfig/lua.pc (lua-pkgconfig-aarch64-linux-gnu) in auto mode
Setting up pkgconf:arm64 (1.8.1-1) ...
Setting up intltool-debian (0.35.0+20060710.6) ...
Setting up dh-autoreconf (20) ...
Setting up pkg-config:arm64 (1.8.1-1) ...
Setting up dh-strip-nondeterminism (1.13.1-1) ...
Setting up dwz (0.15-1) ...
Setting up groff-base (1.22.4-10) ...
Setting up lua5.4 (5.4.4-3) ...
Setting up liblua5.3-dev:arm64 (5.3.6-2) ...
Setting up liblua5.1-0-dev:arm64 (5.1.5-9) ...
Setting up lua5.2 (5.2.4-3) ...
Setting up po-debconf (1.0.21+nmu1) ...
Setting up man-db (2.11.2-2) ...
Not building database; man-db/auto-update is not 'true'.
Setting up debhelper (13.11.4) ...
Setting up dh-lua (29) ...
Processing triggers for libc-bin (2.36-8) ...
Reading package lists...
Building dependency tree...
Reading state information...
Reading extended state information...
Initializing package states...
Writing extended state information...
Building tag database...
 -> Finished parsing the build-deps
I: Building the package
I: Running cd /build/lua-cjson-2.1.0+dfsg/ && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games" HOME="/nonexistent/first-build" dpkg-buildpackage -us -uc -b && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games" HOME="/nonexistent/first-build" dpkg-genchanges -S > ../lua-cjson_2.1.0+dfsg-2.2_source.changes
dpkg-buildpackage: info: source package lua-cjson
dpkg-buildpackage: info: source version 2.1.0+dfsg-2.2
dpkg-buildpackage: info: source distribution unstable
dpkg-buildpackage: info: source changed by Yangfl
 dpkg-source --before-build .
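The build proper is the single dpkg-buildpackage call shown above, run under a deliberately minimal environment (only PATH and HOME are passed in). A sketch of the same invocation from Python, with the working directory, environment, and options copied from the log (illustrative only; pbuilder itself does not run this script):

    #!/usr/bin/env python3
    """Sketch of the build invocation recorded in the log above (illustrative only)."""
    import subprocess

    # Only the variables shown in the log are passed; everything else is dropped.
    env = {
        "PATH": "/usr/sbin:/usr/bin:/sbin:/bin:/usr/games",
        "HOME": "/nonexistent/first-build",
    }
    subprocess.run(
        ["dpkg-buildpackage", "-us", "-uc", "-b"],  # unsigned binary-only build
        cwd="/build/lua-cjson-2.1.0+dfsg",
        env=env,
        check=True,  # fail loudly if the package build fails
    )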
dpkg-buildpackage: info: host architecture arm64 debian/rules clean dh clean --buildsystem=lua --with lua dh_auto_clean -O--buildsystem=lua make --no-print-directory -f /usr/share/dh-lua/make/dh-lua.Makefile.multiple clean Making target clean for debian/lua5.1.dh-lua.conf # fix for leftovers of dh-lua < 14 Target clean made Making target clean for debian/lua5.2.dh-lua.conf # fix for leftovers of dh-lua < 14 Target clean made Making target clean for debian/lua5.3.dh-lua.conf # fix for leftovers of dh-lua < 14 Target clean made Making target clean for debian/lua5.4.dh-lua.conf # fix for leftovers of dh-lua < 14 Target clean made dh_autoreconf_clean -O--buildsystem=lua dh_clean -O--buildsystem=lua debian/rules execute_after_dh_clean make[1]: Entering directory '/build/lua-cjson-2.1.0+dfsg' rm -f -f debian/trash make[1]: Leaving directory '/build/lua-cjson-2.1.0+dfsg' debian/rules binary dh binary --buildsystem=lua --with lua dh_update_autotools_config -O--buildsystem=lua dh_autoreconf -O--buildsystem=lua dh_auto_configure -O--buildsystem=lua make --no-print-directory -f /usr/share/dh-lua/make/dh-lua.Makefile.multiple configure Making target configure for debian/lua5.1.dh-lua.conf # .install Filling in debian/lua-cjson.install using /usr/share/dh-lua/template/lib.install.in Adding new line: usr/lib/aarch64-linux-gnu/lua/5.1/cjson.so Adding new line: usr/lib/aarch64-linux-gnu/liblua5.1-cjson.so.* Adding new line: usr/share/lua/5.1/cjson/util.lua Filling in debian/lua-cjson-dev.install using /usr/share/dh-lua/template/dev.install.in Adding new line: usr/lib/aarch64-linux-gnu/liblua5.1-cjson.so Adding new line: usr/lib/aarch64-linux-gnu/liblua5.1-cjson.a Adding new line: usr/lib/aarch64-linux-gnu/pkgconfig/lua5.1-cjson.pc Adding new line: usr/include/lua5.1/lua-cjson.h # lua_versions Filling in debian/lua_versions Adding new line: 5.1 Target configure made Making target configure for debian/lua5.2.dh-lua.conf # .install Filling in debian/lua-cjson.install using /usr/share/dh-lua/template/lib.install.in Adding new line: usr/lib/aarch64-linux-gnu/lua/5.2/cjson.so Adding new line: usr/lib/aarch64-linux-gnu/liblua5.2-cjson.so.* Adding new line: usr/share/lua/5.2/cjson/util.lua Filling in debian/lua-cjson-dev.install using /usr/share/dh-lua/template/dev.install.in Adding new line: usr/lib/aarch64-linux-gnu/liblua5.2-cjson.so Adding new line: usr/lib/aarch64-linux-gnu/liblua5.2-cjson.a Adding new line: usr/lib/aarch64-linux-gnu/pkgconfig/lua5.2-cjson.pc Adding new line: usr/include/lua5.2/lua-cjson.h # lua_versions Filling in debian/lua_versions Adding new line: 5.2 Target configure made Making target configure for debian/lua5.3.dh-lua.conf # .install Filling in debian/lua-cjson.install using /usr/share/dh-lua/template/lib.install.in Adding new line: usr/lib/aarch64-linux-gnu/lua/5.3/cjson.so Adding new line: usr/lib/aarch64-linux-gnu/liblua5.3-cjson.so.* Adding new line: usr/share/lua/5.3/cjson/util.lua Filling in debian/lua-cjson-dev.install using /usr/share/dh-lua/template/dev.install.in Adding new line: usr/lib/aarch64-linux-gnu/liblua5.3-cjson.so Adding new line: usr/lib/aarch64-linux-gnu/liblua5.3-cjson.a Adding new line: usr/lib/aarch64-linux-gnu/pkgconfig/lua5.3-cjson.pc Adding new line: usr/include/lua5.3/lua-cjson.h # lua_versions Filling in debian/lua_versions Adding new line: 5.3 Target configure made Making target configure for debian/lua5.4.dh-lua.conf # .install Filling in debian/lua-cjson.install using /usr/share/dh-lua/template/lib.install.in Adding new line: 
usr/lib/aarch64-linux-gnu/lua/5.4/cjson.so Adding new line: usr/lib/aarch64-linux-gnu/liblua5.4-cjson.so.* Adding new line: usr/share/lua/5.4/cjson/util.lua Filling in debian/lua-cjson-dev.install using /usr/share/dh-lua/template/dev.install.in Adding new line: usr/lib/aarch64-linux-gnu/liblua5.4-cjson.so Adding new line: usr/lib/aarch64-linux-gnu/liblua5.4-cjson.a Adding new line: usr/lib/aarch64-linux-gnu/pkgconfig/lua5.4-cjson.pc Adding new line: usr/include/lua5.4/lua-cjson.h # lua_versions Filling in debian/lua_versions Adding new line: 5.4 Target configure made dh_auto_build -O--buildsystem=lua make --no-print-directory -f /usr/share/dh-lua/make/dh-lua.Makefile.multiple build Making target build for debian/lua5.1.dh-lua.conf libtoolize: putting auxiliary files in '.'. libtoolize: copying file './config.guess' libtoolize: copying file './config.sub' libtoolize: copying file './install-sh' libtoolize: copying file './ltmain.sh' libtoolize: Consider adding 'AC_CONFIG_MACRO_DIRS([m4])' to configure.ac, libtoolize: and rerunning libtoolize and aclocal. libtoolize: Consider adding '-I m4' to ACLOCAL_AMFLAGS in Makefile.am. cd debian/.dh_lua-libtool && ./configure --build=aarch64-linux-gnu --prefix=/usr --includedir=\${prefix}/include --mandir=\${prefix}/share/man --infodir=\${prefix}/share/info --sysconfdir=/etc --localstatedir=/var --disable-option-checking --disable-silent-rules --libdir=\${prefix}/lib/aarch64-linux-gnu --runstatedir=/run --disable-maintainer-mode --disable-dependency-tracking CFLAGS= LDFLAGS= LDFLAGS_STATIC= checking build system type... aarch64-unknown-linux-gnu checking host system type... aarch64-unknown-linux-gnu checking how to print strings... printf checking for gcc... gcc checking whether the C compiler works... yes checking for C compiler default output file name... a.out checking for suffix of executables... checking whether we are cross compiling... no checking for suffix of object files... o checking whether the compiler supports GNU C... yes checking whether gcc accepts -g... yes checking for gcc option to enable C11 features... none needed checking for a sed that does not truncate output... /usr/bin/sed checking for grep that handles long lines and -e... /usr/bin/grep checking for egrep... /usr/bin/grep -E checking for fgrep... /usr/bin/grep -F checking for ld used by gcc... /usr/bin/ld checking if the linker (/usr/bin/ld) is GNU ld... yes checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B checking the name lister (/usr/bin/nm -B) interface... BSD nm checking whether ln -s works... yes checking the maximum length of command line arguments... 1572864 checking how to convert aarch64-unknown-linux-gnu file names to aarch64-unknown-linux-gnu format... func_convert_file_noop checking how to convert aarch64-unknown-linux-gnu file names to toolchain format... func_convert_file_noop checking for /usr/bin/ld option to reload object files... -r checking for file... file checking for objdump... objdump checking how to recognize dependent libraries... pass_all checking for dlltool... no checking how to associate runtime and link libraries... printf %s\n checking for ar... ar checking for archiver @FILE support... @ checking for strip... strip checking for ranlib... ranlib checking for gawk... no checking for mawk... mawk checking command to parse /usr/bin/nm -B output from gcc object... ok checking for sysroot... no checking for a working dd... /usr/bin/dd checking how to truncate binary pipes... /usr/bin/dd bs=4096 count=1 checking for mt... 
no checking if : is a manifest tool... no checking for stdio.h... yes checking for stdlib.h... yes checking for string.h... yes checking for inttypes.h... yes checking for stdint.h... yes checking for strings.h... yes checking for sys/stat.h... yes checking for sys/types.h... yes checking for unistd.h... yes checking for dlfcn.h... yes checking for objdir... .libs checking if gcc supports -fno-rtti -fno-exceptions... no checking for gcc option to produce PIC... -fPIC -DPIC checking if gcc PIC flag -fPIC -DPIC works... yes checking if gcc static flag -static works... yes checking if gcc supports -c -o file.o... yes checking if gcc supports -c -o file.o... (cached) yes checking whether the gcc linker (/usr/bin/ld) supports shared libraries... yes checking whether -lc should be explicitly linked in... no checking dynamic linker characteristics... GNU/Linux ld.so checking how to hardcode library paths into programs... immediate checking whether stripping libraries is possible... yes checking if libtool supports shared libraries... yes checking whether to build shared libraries... yes checking whether to build static libraries... yes configure: creating ./config.status config.status: executing libtool commands /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=compile aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.1 -Wall -Wextra -o /build/lua-cjson-2.1.0+dfsg/5.1-cjson/lua_cjson.lo lua_cjson.c libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.1 -Wall -Wextra lua_cjson.c -fPIC -DPIC -o /build/lua-cjson-2.1.0+dfsg/5.1-cjson/.libs/lua_cjson.o lua_cjson.c: In function 'json_append_string': lua_cjson.c:477:19: warning: comparison of integer expressions of different signedness: 'int' and 'size_t' {aka 'long unsigned int'} [-Wsign-compare] 477 | for (i = 0; i < len; i++) { | ^ In file included from lua_cjson.c:47: fpconv.h: At top level: fpconv.h:15:20: warning: inline function 'fpconv_init' declared but never defined 15 | extern inline void fpconv_init(); | ^~~~~~~~~~~ lua_cjson.c: In function 'json_append_data': lua_cjson.c:689:12: warning: this statement may fall through [-Wimplicit-fallthrough=] 689 | if (lua_touserdata(l, -1) == NULL) { | ^ lua_cjson.c:693:5: note: here 693 | default: | ^~~~~~~ libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.1 -Wall -Wextra lua_cjson.c -o /build/lua-cjson-2.1.0+dfsg/5.1-cjson/lua_cjson.o >/dev/null 2>&1 /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=compile aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.1 -Wall -Wextra -o /build/lua-cjson-2.1.0+dfsg/5.1-cjson/fpconv.lo fpconv.c libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. 
-fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.1 -Wall -Wextra fpconv.c -fPIC -DPIC -o /build/lua-cjson-2.1.0+dfsg/5.1-cjson/.libs/fpconv.o libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.1 -Wall -Wextra fpconv.c -o /build/lua-cjson-2.1.0+dfsg/5.1-cjson/fpconv.o >/dev/null 2>&1 /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=compile aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.1 -Wall -Wextra -o /build/lua-cjson-2.1.0+dfsg/5.1-cjson/strbuf.lo strbuf.c libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.1 -Wall -Wextra strbuf.c -fPIC -DPIC -o /build/lua-cjson-2.1.0+dfsg/5.1-cjson/.libs/strbuf.o libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.1 -Wall -Wextra strbuf.c -o /build/lua-cjson-2.1.0+dfsg/5.1-cjson/strbuf.o >/dev/null 2>&1 /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=link aarch64-linux-gnu-gcc \ -rpath /usr//lib/aarch64-linux-gnu -version-info 0:0:0 -Wl,--no-add-needed \ -o /build/lua-cjson-2.1.0+dfsg/5.1-cjson/liblua5.1-cjson.la \ /build/lua-cjson-2.1.0+dfsg/5.1-cjson/lua_cjson.lo /build/lua-cjson-2.1.0+dfsg/5.1-cjson/fpconv.lo /build/lua-cjson-2.1.0+dfsg/5.1-cjson/strbuf.lo \ -Wl,-z,relro -Wl,-z,now libtool: link: gcc -shared -fPIC -DPIC /build/lua-cjson-2.1.0+dfsg/5.1-cjson/.libs/lua_cjson.o /build/lua-cjson-2.1.0+dfsg/5.1-cjson/.libs/fpconv.o /build/lua-cjson-2.1.0+dfsg/5.1-cjson/.libs/strbuf.o -Wl,--no-add-needed -Wl,-z -Wl,relro -Wl,-z -Wl,now -Wl,-soname -Wl,liblua5.1-cjson.so.0 -o /build/lua-cjson-2.1.0+dfsg/5.1-cjson/.libs/liblua5.1-cjson.so.0.0.0 libtool: link: (cd "/build/lua-cjson-2.1.0+dfsg/5.1-cjson/.libs" && rm -f "liblua5.1-cjson.so.0" && ln -s "liblua5.1-cjson.so.0.0.0" "liblua5.1-cjson.so.0") libtool: link: (cd "/build/lua-cjson-2.1.0+dfsg/5.1-cjson/.libs" && rm -f "liblua5.1-cjson.so" && ln -s "liblua5.1-cjson.so.0.0.0" "liblua5.1-cjson.so") libtool: link: ar cr /build/lua-cjson-2.1.0+dfsg/5.1-cjson/.libs/liblua5.1-cjson.a /build/lua-cjson-2.1.0+dfsg/5.1-cjson/lua_cjson.o /build/lua-cjson-2.1.0+dfsg/5.1-cjson/fpconv.o /build/lua-cjson-2.1.0+dfsg/5.1-cjson/strbuf.o libtool: link: ranlib /build/lua-cjson-2.1.0+dfsg/5.1-cjson/.libs/liblua5.1-cjson.a libtool: link: ( cd "/build/lua-cjson-2.1.0+dfsg/5.1-cjson/.libs" && rm -f "liblua5.1-cjson.la" && ln -s "../liblua5.1-cjson.la" "liblua5.1-cjson.la" ) ldd /build/lua-cjson-2.1.0+dfsg/5.1-cjson/cjson.so linux-vdso.so.1 (0x0000ffff90bd3000) libc.so.6 => /lib/aarch64-linux-gnu/libc.so.6 (0x0000ffff909c0000) /lib/ld-linux-aarch64.so.1 (0x0000ffff90b96000) Target build made Making target build for debian/lua5.2.dh-lua.conf /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=compile aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. 
-fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.2 -Wall -Wextra -o /build/lua-cjson-2.1.0+dfsg/5.2-cjson/lua_cjson.lo lua_cjson.c libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.2 -Wall -Wextra lua_cjson.c -fPIC -DPIC -o /build/lua-cjson-2.1.0+dfsg/5.2-cjson/.libs/lua_cjson.o lua_cjson.c: In function 'json_append_string': lua_cjson.c:477:19: warning: comparison of integer expressions of different signedness: 'int' and 'size_t' {aka 'long unsigned int'} [-Wsign-compare] 477 | for (i = 0; i < len; i++) { | ^ In file included from lua_cjson.c:47: fpconv.h: At top level: fpconv.h:15:20: warning: inline function 'fpconv_init' declared but never defined 15 | extern inline void fpconv_init(); | ^~~~~~~~~~~ lua_cjson.c: In function 'json_append_data': lua_cjson.c:689:12: warning: this statement may fall through [-Wimplicit-fallthrough=] 689 | if (lua_touserdata(l, -1) == NULL) { | ^ lua_cjson.c:693:5: note: here 693 | default: | ^~~~~~~ libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.2 -Wall -Wextra lua_cjson.c -o /build/lua-cjson-2.1.0+dfsg/5.2-cjson/lua_cjson.o >/dev/null 2>&1 /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=compile aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.2 -Wall -Wextra -o /build/lua-cjson-2.1.0+dfsg/5.2-cjson/fpconv.lo fpconv.c libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.2 -Wall -Wextra fpconv.c -fPIC -DPIC -o /build/lua-cjson-2.1.0+dfsg/5.2-cjson/.libs/fpconv.o libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.2 -Wall -Wextra fpconv.c -o /build/lua-cjson-2.1.0+dfsg/5.2-cjson/fpconv.o >/dev/null 2>&1 /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=compile aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.2 -Wall -Wextra -o /build/lua-cjson-2.1.0+dfsg/5.2-cjson/strbuf.lo strbuf.c libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.2 -Wall -Wextra strbuf.c -fPIC -DPIC -o /build/lua-cjson-2.1.0+dfsg/5.2-cjson/.libs/strbuf.o libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. 
-fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.2 -Wall -Wextra strbuf.c -o /build/lua-cjson-2.1.0+dfsg/5.2-cjson/strbuf.o >/dev/null 2>&1 /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=link aarch64-linux-gnu-gcc \ -rpath /usr//lib/aarch64-linux-gnu -version-info 0:0:0 -Wl,--no-add-needed \ -o /build/lua-cjson-2.1.0+dfsg/5.2-cjson/liblua5.2-cjson.la \ /build/lua-cjson-2.1.0+dfsg/5.2-cjson/lua_cjson.lo /build/lua-cjson-2.1.0+dfsg/5.2-cjson/fpconv.lo /build/lua-cjson-2.1.0+dfsg/5.2-cjson/strbuf.lo \ -Wl,-z,relro -Wl,-z,now libtool: link: gcc -shared -fPIC -DPIC /build/lua-cjson-2.1.0+dfsg/5.2-cjson/.libs/lua_cjson.o /build/lua-cjson-2.1.0+dfsg/5.2-cjson/.libs/fpconv.o /build/lua-cjson-2.1.0+dfsg/5.2-cjson/.libs/strbuf.o -Wl,--no-add-needed -Wl,-z -Wl,relro -Wl,-z -Wl,now -Wl,-soname -Wl,liblua5.2-cjson.so.0 -o /build/lua-cjson-2.1.0+dfsg/5.2-cjson/.libs/liblua5.2-cjson.so.0.0.0 libtool: link: (cd "/build/lua-cjson-2.1.0+dfsg/5.2-cjson/.libs" && rm -f "liblua5.2-cjson.so.0" && ln -s "liblua5.2-cjson.so.0.0.0" "liblua5.2-cjson.so.0") libtool: link: (cd "/build/lua-cjson-2.1.0+dfsg/5.2-cjson/.libs" && rm -f "liblua5.2-cjson.so" && ln -s "liblua5.2-cjson.so.0.0.0" "liblua5.2-cjson.so") libtool: link: ar cr /build/lua-cjson-2.1.0+dfsg/5.2-cjson/.libs/liblua5.2-cjson.a /build/lua-cjson-2.1.0+dfsg/5.2-cjson/lua_cjson.o /build/lua-cjson-2.1.0+dfsg/5.2-cjson/fpconv.o /build/lua-cjson-2.1.0+dfsg/5.2-cjson/strbuf.o libtool: link: ranlib /build/lua-cjson-2.1.0+dfsg/5.2-cjson/.libs/liblua5.2-cjson.a libtool: link: ( cd "/build/lua-cjson-2.1.0+dfsg/5.2-cjson/.libs" && rm -f "liblua5.2-cjson.la" && ln -s "../liblua5.2-cjson.la" "liblua5.2-cjson.la" ) ldd /build/lua-cjson-2.1.0+dfsg/5.2-cjson/cjson.so linux-vdso.so.1 (0x0000ffffb428f000) libc.so.6 => /lib/aarch64-linux-gnu/libc.so.6 (0x0000ffffb4080000) /lib/ld-linux-aarch64.so.1 (0x0000ffffb4252000) Target build made Making target build for debian/lua5.3.dh-lua.conf /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=compile aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.3 -Wall -Wextra -o /build/lua-cjson-2.1.0+dfsg/5.3-cjson/lua_cjson.lo lua_cjson.c libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.3 -Wall -Wextra lua_cjson.c -fPIC -DPIC -o /build/lua-cjson-2.1.0+dfsg/5.3-cjson/.libs/lua_cjson.o lua_cjson.c: In function 'json_append_string': lua_cjson.c:477:19: warning: comparison of integer expressions of different signedness: 'int' and 'size_t' {aka 'long unsigned int'} [-Wsign-compare] 477 | for (i = 0; i < len; i++) { | ^ In file included from lua_cjson.c:47: fpconv.h: At top level: fpconv.h:15:20: warning: inline function 'fpconv_init' declared but never defined 15 | extern inline void fpconv_init(); | ^~~~~~~~~~~ lua_cjson.c: In function 'json_append_data': lua_cjson.c:689:12: warning: this statement may fall through [-Wimplicit-fallthrough=] 689 | if (lua_touserdata(l, -1) == NULL) { | ^ lua_cjson.c:693:5: note: here 693 | default: | ^~~~~~~ libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. 
-fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.3 -Wall -Wextra lua_cjson.c -o /build/lua-cjson-2.1.0+dfsg/5.3-cjson/lua_cjson.o >/dev/null 2>&1 /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=compile aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.3 -Wall -Wextra -o /build/lua-cjson-2.1.0+dfsg/5.3-cjson/fpconv.lo fpconv.c libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.3 -Wall -Wextra fpconv.c -fPIC -DPIC -o /build/lua-cjson-2.1.0+dfsg/5.3-cjson/.libs/fpconv.o libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.3 -Wall -Wextra fpconv.c -o /build/lua-cjson-2.1.0+dfsg/5.3-cjson/fpconv.o >/dev/null 2>&1 /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=compile aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.3 -Wall -Wextra -o /build/lua-cjson-2.1.0+dfsg/5.3-cjson/strbuf.lo strbuf.c libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.3 -Wall -Wextra strbuf.c -fPIC -DPIC -o /build/lua-cjson-2.1.0+dfsg/5.3-cjson/.libs/strbuf.o libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. 
-fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.3 -Wall -Wextra strbuf.c -o /build/lua-cjson-2.1.0+dfsg/5.3-cjson/strbuf.o >/dev/null 2>&1 /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=link aarch64-linux-gnu-gcc \ -rpath /usr//lib/aarch64-linux-gnu -version-info 0:0:0 -Wl,--no-add-needed \ -o /build/lua-cjson-2.1.0+dfsg/5.3-cjson/liblua5.3-cjson.la \ /build/lua-cjson-2.1.0+dfsg/5.3-cjson/lua_cjson.lo /build/lua-cjson-2.1.0+dfsg/5.3-cjson/fpconv.lo /build/lua-cjson-2.1.0+dfsg/5.3-cjson/strbuf.lo \ -Wl,-z,relro -Wl,-z,now libtool: link: gcc -shared -fPIC -DPIC /build/lua-cjson-2.1.0+dfsg/5.3-cjson/.libs/lua_cjson.o /build/lua-cjson-2.1.0+dfsg/5.3-cjson/.libs/fpconv.o /build/lua-cjson-2.1.0+dfsg/5.3-cjson/.libs/strbuf.o -Wl,--no-add-needed -Wl,-z -Wl,relro -Wl,-z -Wl,now -Wl,-soname -Wl,liblua5.3-cjson.so.0 -o /build/lua-cjson-2.1.0+dfsg/5.3-cjson/.libs/liblua5.3-cjson.so.0.0.0 libtool: link: (cd "/build/lua-cjson-2.1.0+dfsg/5.3-cjson/.libs" && rm -f "liblua5.3-cjson.so.0" && ln -s "liblua5.3-cjson.so.0.0.0" "liblua5.3-cjson.so.0") libtool: link: (cd "/build/lua-cjson-2.1.0+dfsg/5.3-cjson/.libs" && rm -f "liblua5.3-cjson.so" && ln -s "liblua5.3-cjson.so.0.0.0" "liblua5.3-cjson.so") libtool: link: ar cr /build/lua-cjson-2.1.0+dfsg/5.3-cjson/.libs/liblua5.3-cjson.a /build/lua-cjson-2.1.0+dfsg/5.3-cjson/lua_cjson.o /build/lua-cjson-2.1.0+dfsg/5.3-cjson/fpconv.o /build/lua-cjson-2.1.0+dfsg/5.3-cjson/strbuf.o libtool: link: ranlib /build/lua-cjson-2.1.0+dfsg/5.3-cjson/.libs/liblua5.3-cjson.a libtool: link: ( cd "/build/lua-cjson-2.1.0+dfsg/5.3-cjson/.libs" && rm -f "liblua5.3-cjson.la" && ln -s "../liblua5.3-cjson.la" "liblua5.3-cjson.la" ) ldd /build/lua-cjson-2.1.0+dfsg/5.3-cjson/cjson.so linux-vdso.so.1 (0x0000ffff87a08000) libc.so.6 => /lib/aarch64-linux-gnu/libc.so.6 (0x0000ffff877f0000) /lib/ld-linux-aarch64.so.1 (0x0000ffff879cb000) Target build made Making target build for debian/lua5.4.dh-lua.conf /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=compile aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.4 -Wall -Wextra -o /build/lua-cjson-2.1.0+dfsg/5.4-cjson/lua_cjson.lo lua_cjson.c libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.4 -Wall -Wextra lua_cjson.c -fPIC -DPIC -o /build/lua-cjson-2.1.0+dfsg/5.4-cjson/.libs/lua_cjson.o lua_cjson.c: In function 'json_append_string': lua_cjson.c:477:19: warning: comparison of integer expressions of different signedness: 'int' and 'size_t' {aka 'long unsigned int'} [-Wsign-compare] 477 | for (i = 0; i < len; i++) { | ^ In file included from lua_cjson.c:47: fpconv.h: At top level: fpconv.h:15:20: warning: inline function 'fpconv_init' declared but never defined 15 | extern inline void fpconv_init(); | ^~~~~~~~~~~ lua_cjson.c: In function 'json_append_data': lua_cjson.c:689:12: warning: this statement may fall through [-Wimplicit-fallthrough=] 689 | if (lua_touserdata(l, -1) == NULL) { | ^ lua_cjson.c:693:5: note: here 693 | default: | ^~~~~~~ libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. 
-fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.4 -Wall -Wextra lua_cjson.c -o /build/lua-cjson-2.1.0+dfsg/5.4-cjson/lua_cjson.o >/dev/null 2>&1 /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=compile aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.4 -Wall -Wextra -o /build/lua-cjson-2.1.0+dfsg/5.4-cjson/fpconv.lo fpconv.c libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.4 -Wall -Wextra fpconv.c -fPIC -DPIC -o /build/lua-cjson-2.1.0+dfsg/5.4-cjson/.libs/fpconv.o libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.4 -Wall -Wextra fpconv.c -o /build/lua-cjson-2.1.0+dfsg/5.4-cjson/fpconv.o >/dev/null 2>&1 /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=compile aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.4 -Wall -Wextra -o /build/lua-cjson-2.1.0+dfsg/5.4-cjson/strbuf.lo strbuf.c libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.4 -Wall -Wextra strbuf.c -fPIC -DPIC -o /build/lua-cjson-2.1.0+dfsg/5.4-cjson/.libs/strbuf.o libtool: compile: aarch64-linux-gnu-gcc -c -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. 
-fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.4 -Wall -Wextra strbuf.c -o /build/lua-cjson-2.1.0+dfsg/5.4-cjson/strbuf.o >/dev/null 2>&1 /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=link aarch64-linux-gnu-gcc \ -rpath /usr//lib/aarch64-linux-gnu -version-info 0:0:0 -Wl,--no-add-needed \ -o /build/lua-cjson-2.1.0+dfsg/5.4-cjson/liblua5.4-cjson.la \ /build/lua-cjson-2.1.0+dfsg/5.4-cjson/lua_cjson.lo /build/lua-cjson-2.1.0+dfsg/5.4-cjson/fpconv.lo /build/lua-cjson-2.1.0+dfsg/5.4-cjson/strbuf.lo \ -Wl,-z,relro -Wl,-z,now libtool: link: gcc -shared -fPIC -DPIC /build/lua-cjson-2.1.0+dfsg/5.4-cjson/.libs/lua_cjson.o /build/lua-cjson-2.1.0+dfsg/5.4-cjson/.libs/fpconv.o /build/lua-cjson-2.1.0+dfsg/5.4-cjson/.libs/strbuf.o -Wl,--no-add-needed -Wl,-z -Wl,relro -Wl,-z -Wl,now -Wl,-soname -Wl,liblua5.4-cjson.so.0 -o /build/lua-cjson-2.1.0+dfsg/5.4-cjson/.libs/liblua5.4-cjson.so.0.0.0 libtool: link: (cd "/build/lua-cjson-2.1.0+dfsg/5.4-cjson/.libs" && rm -f "liblua5.4-cjson.so.0" && ln -s "liblua5.4-cjson.so.0.0.0" "liblua5.4-cjson.so.0") libtool: link: (cd "/build/lua-cjson-2.1.0+dfsg/5.4-cjson/.libs" && rm -f "liblua5.4-cjson.so" && ln -s "liblua5.4-cjson.so.0.0.0" "liblua5.4-cjson.so") libtool: link: ar cr /build/lua-cjson-2.1.0+dfsg/5.4-cjson/.libs/liblua5.4-cjson.a /build/lua-cjson-2.1.0+dfsg/5.4-cjson/lua_cjson.o /build/lua-cjson-2.1.0+dfsg/5.4-cjson/fpconv.o /build/lua-cjson-2.1.0+dfsg/5.4-cjson/strbuf.o libtool: link: ranlib /build/lua-cjson-2.1.0+dfsg/5.4-cjson/.libs/liblua5.4-cjson.a libtool: link: ( cd "/build/lua-cjson-2.1.0+dfsg/5.4-cjson/.libs" && rm -f "liblua5.4-cjson.la" && ln -s "../liblua5.4-cjson.la" "liblua5.4-cjson.la" ) ldd /build/lua-cjson-2.1.0+dfsg/5.4-cjson/cjson.so linux-vdso.so.1 (0x0000ffff825ef000) libc.so.6 => /lib/aarch64-linux-gnu/libc.so.6 (0x0000ffff823e0000) /lib/ld-linux-aarch64.so.1 (0x0000ffff825b2000) Target build made dh_auto_test -O--buildsystem=lua make --no-print-directory -f /usr/share/dh-lua/make/dh-lua.Makefile.multiple test Making target test for debian/lua5.1.dh-lua.conf # tests Copying lua/cjson/util.lua in /build/lua-cjson-2.1.0+dfsg/5.1-cjson for test ********************** lua dynamic (5.1) ********* Test: cd tests/ && @@LUA@@ test.lua ==> Testing Lua CJSON version 2.1.0 ==> Test [1] Check module name, version: PASS [Input] { } [Received:success] { "cjson", "2.1.0" } ==> Test [2] Decode string: PASS [Input] { "\"test string\"" } [Received:success] { "test string" } ==> Test [3] Decode numbers: PASS [Input] { "[ 0.0, -5e3, -1, 0.3e-3, 1023.2, 0e10 ]" } [Received:success] { { 0, -5000, -1, 0.0003, 1023.2, 0 } } ==> Test [4] Decode null: PASS [Input] { "null" } [Received:success] { json.null } ==> Test [5] Decode true: PASS [Input] { "true" } [Received:success] { true } ==> Test [6] Decode false: PASS [Input] { "false" } [Received:success] { false } ==> Test [7] Decode object with numeric keys: PASS [Input] { "{ \"1\": \"one\", \"3\": \"three\" }" } [Received:success] { { ["1"] = "one", ["3"] = "three" } } ==> Test [8] Decode object with string keys: PASS [Input] { "{ \"a\": \"a\", \"b\": \"b\" }" } [Received:success] { { ["a"] = "a", ["b"] = "b" } } ==> Test [9] Decode array: PASS [Input] { "[ \"one\", null, \"three\" ]" } [Received:success] { { "one", json.null, "three" } } ==> Test [10] Decode UTF-16BE [throw error]: PASS [Input] { "\000\"\000\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [11] Decode UTF-16LE [throw 
error]: PASS [Input] { "\"\000\"\000" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [12] Decode UTF-32BE [throw error]: PASS [Input] { "\000\000\000\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [13] Decode UTF-32LE [throw error]: PASS [Input] { "\"\000\000\000" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [14] Decode partial JSON [throw error]: PASS [Input] { "{ \"unexpected eof\": " } [Received:error] { "Expected value but found T_END at character 21" } ==> Test [15] Decode with extra comma [throw error]: PASS [Input] { "{ \"extra data\": true }, false" } [Received:error] { "Expected the end but found T_COMMA at character 23" } ==> Test [16] Decode invalid escape code [throw error]: PASS [Input] { " { \"bad escape \\q code\" } " } [Received:error] { "Expected object key string but found invalid escape code at character 16" } ==> Test [17] Decode invalid unicode escape [throw error]: PASS [Input] { " { \"bad unicode \\u0f6 escape\" } " } [Received:error] { "Expected object key string but found invalid unicode escape code at character 17" } ==> Test [18] Decode invalid keyword [throw error]: PASS [Input] { " [ \"bad barewood\", test ] " } [Received:error] { "Expected value but found invalid token at character 20" } ==> Test [19] Decode invalid number #1 [throw error]: PASS [Input] { "[ -+12 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [20] Decode invalid number #2 [throw error]: PASS [Input] { "-v" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [21] Decode invalid number exponent [throw error]: PASS [Input] { "[ 0.4eg10 ]" } [Received:error] { "Expected comma or array end but found invalid token at character 6" } ==> Test [22] Set decode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [23] Decode array at nested limit: PASS [Input] { "[[[[[ \"nested\" ]]]]]" } [Received:success] { { { { { { "nested" } } } } } } ==> Test [24] Decode array over nested limit [throw error]: PASS [Input] { "[[[[[[ \"nested\" ]]]]]]" } [Received:error] { "Found too many nested data structures (6) at character 6" } ==> Test [25] Decode object at nested limit: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } [Received:success] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } ==> Test [26] Decode object over nested limit [throw error]: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":{\"f\":\"nested\"}}}}}}" } [Received:error] { "Found too many nested data structures (6) at character 26" } ==> Test [27] Set decode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [28] Decode deeply nested array [throw error]: PASS [Input] { 
"[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[1100]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]" } [Received:error] { "Found too many nested data structures (1001) at character 1001" } ==> Test [29] Set encode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [30] Encode nested table as array at nested limit: PASS [Input] { { { { { { "nested" } } } } } } [Received:success] { "[[[[[\"nested\"]]]]]" } ==> Test [31] Encode nested table as array after nested limit [throw error]: PASS [Input] { { { { { { { "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [32] Encode nested table as object at nested limit: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } [Received:success] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } ==> Test [33] Encode nested table as object over nested limit [throw error]: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = { ["f"] = "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [34] Encode table with cycle [throw error]: PASS [Input] { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive 
nesting (6)" } ==> Test [35] Set encode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [36] Encode deeply nested data [throw error]: PASS [Input] { { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = Cannot serialise any further: too many nested tables, [2] = "string", ["a"] = Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (1001)" } 
==> Test [37] Encode null: PASS [Input] { json.null } [Received:success] { "null" } ==> Test [38] Encode true: PASS [Input] { true } [Received:success] { "true" } ==> Test [39] Encode false: PASS [Input] { false } [Received:success] { "false" } ==> Test [40] Encode empty object: PASS [Input] { { } } [Received:success] { "{}" } ==> Test [41] Encode integer: PASS [Input] { 10 } [Received:success] { "10" } ==> Test [42] Encode string: PASS [Input] { "hello" } [Received:success] { "\"hello\"" } ==> Test [43] Encode Lua function [throw error]: PASS [Input] { "" } [Received:error] { "Cannot serialise function: type not supported" } ==> Test [44] Set decode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [45] Decode hexadecimal: PASS [Input] { "0x6.ffp1" } [Received:success] { 13.9921875 } ==> Test [46] Decode numbers with leading zero: PASS [Input] { "[ 0123, 00.33 ]" } [Received:success] { { 123, 0.33 } } ==> Test [47] Decode +-Inf: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:success] { { inf, inf, -inf } } ==> Test [48] Decode +-Infinity: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:success] { { inf, inf, -inf } } ==> Test [49] Decode +-NaN: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:success] { { nan, nan, -nan } } ==> Test [50] Decode Infrared (not infinity) [throw error]: PASS [Input] { "Infrared" } [Received:error] { "Expected the end but found invalid token at character 4" } ==> Test [51] Decode Noodle (not NaN) [throw error]: PASS [Input] { "Noodle" } [Received:error] { "Expected value but found invalid token at character 1" } ==> Test [52] Set decode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [53] Decode hexadecimal [throw error]: PASS [Input] { "0x6" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [54] Decode numbers with leading zero [throw error]: PASS [Input] { "[ 0123, 00.33 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [55] Decode +-Inf [throw error]: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [56] Decode +-Infinity [throw error]: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [57] Decode +-NaN [throw error]: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [58] Set decode_invalid_numbers("on"): PASS [Input] { "on" } [Received:success] { true } ==> Test [59] Set encode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [60] Encode NaN [throw error]: PASS [Input] { nan } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [61] Encode Infinity [throw error]: PASS [Input] { inf } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [62] Set encode_invalid_numbers("null"): PASS [Input] { "null" } [Received:success] { "null" } ==> Test [63] Encode NaN as null: PASS [Input] { nan } [Received:success] { "null" } ==> Test [64] Encode Infinity as null: PASS [Input] { inf } [Received:success] { "null" } ==> Test [65] Set encode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [66] Encode NaN: PASS [Input] { nan } [Received:success] { "nan" } ==> Test [67] Encode Infinity: PASS [Input] { inf } [Received:success] { "inf" } ==> Test [68] 
Set encode_invalid_numbers("off"): PASS [Input] { "off" } [Received:success] { false } ==> Test [69] Set encode_sparse_array(true, 2, 3): PASS [Input] { true, 2, 3 } [Received:success] { true, 2, 3 } ==> Test [70] Encode sparse table as array #1: PASS [Input] { { [3] = "sparse test" } } [Received:success] { "[null,null,\"sparse test\"]" } ==> Test [71] Encode sparse table as array #2: PASS [Input] { { "one", nil, nil, "sparse test" } } [Received:success] { "[\"one\",null,null,\"sparse test\"]" } ==> Test [72] Encode table with numeric string key as object: PASS [Input] { { ["2"] = "numeric string key test" } } [Received:success] { "{\"2\":\"numeric string key test\"}" } ==> Test [73] Set encode_sparse_array(false): PASS [Input] { false } [Received:success] { false, 2, 3 } ==> Test [74] Encode table with incompatible key [throw error]: PASS [Input] { { [false] = "wrong" } } [Received:error] { "Cannot serialise boolean: table key must be a number or string" } ==> Test [75] Encode all octets (8-bit clean): PASS [Input] { "\000 \ \r !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~" } [Received:success] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } ==> Test [76] Decode all escaped octets: PASS [Input] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } [Received:success] { "\000 \ \r !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~" } ==> Test [77] Decode single UTF-16 escape: PASS [Input] { "\"\\uF800\"" } [Received:success] { "" } ==> Test [78] Decode swapped surrogate pair [throw error]: PASS [Input] { "\"\\uDC00\\uD800\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [79] Decode duplicate high surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [80] Decode duplicate low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [81] Decode missing low surrogate [throw error]: PASS [Input] { "\"\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [82] Decode invalid low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uD\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Set locale to cs_CZ (comma separator) ==> Test [83] Encode number under comma locale: PASS [Input] { 1,5 } [Received:success] { "1.5" } ==> Test [84] Decode number in array under comma locale: PASS [Input] { "[ 10, \"test\" ]" } [Received:success] { { 10, "test" } } ==> Revert locale to POSIX ==> Test [85] Set encode_keep_buffer(false): PASS [Input] { false } [Received:success] { false } ==> Test [86] Set encode_number_precision(3): PASS [Input] { 3 } [Received:success] { 3 } ==> Test [87] Encode number with 
precision 3: PASS [Input] { 0.33333333333333 } [Received:success] { "0.333" } ==> Test [88] Set encode_number_precision(14): PASS [Input] { 14 } [Received:success] { 14 } ==> Test [89] Set encode_keep_buffer(true): PASS [Input] { true } [Received:success] { true } ==> Test [90] Set encode_number_precision(0) [throw error]: PASS [Input] { 0 } [Received:error] { "bad argument #1 to '?' (expected integer between 1 and 14)" } ==> Test [91] Set encode_number_precision("five") [throw error]: PASS [Input] { "five" } [Received:error] { "bad argument #1 to '?' (number expected, got string)" } ==> Test [92] Set encode_keep_buffer(nil, true) [throw error]: PASS [Input] { nil, true } [Received:error] { "bad argument #2 to '?' (found too many arguments)" } ==> Test [93] Set encode_max_depth("wrong") [throw error]: PASS [Input] { "wrong" } [Received:error] { "bad argument #1 to '?' (number expected, got string)" } ==> Test [94] Set decode_max_depth(0) [throw error]: PASS [Input] { "0" } [Received:error] { "bad argument #1 to '?' (expected integer between 1 and 2147483647)" } ==> Test [95] Set encode_invalid_numbers(-2) [throw error]: PASS [Input] { -2 } [Received:error] { "bad argument #1 to '?' (invalid option '-2')" } ==> Test [96] Set decode_invalid_numbers(true, false) [throw error]: PASS [Input] { true, false } [Received:error] { "bad argument #2 to '?' (found too many arguments)" } ==> Test [97] Set encode_sparse_array("not quite on") [throw error]: PASS [Input] { "not quite on" } [Received:error] { "bad argument #1 to '?' (invalid option 'not quite on')" } ==> Reset Lua CJSON configuration ==> Test [98] Check encode_sparse_array(): PASS [Input] { } [Received:success] { false, 2, 10 } ==> Test [99] Encode (safe) simple value: PASS [Input] { true } [Received:success] { "true" } ==> Test [100] Encode (safe) argument validation [throw error]: PASS [Input] { "arg1", "arg2" } [Received:error] { "bad argument #1 to '?' (expected 1 argument)" } ==> Test [101] Decode (safe) error generation: PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Test [102] Decode (safe) error generation after new(): PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Summary: all tests succeeded ************************************************** /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=link aarch64-linux-gnu-gcc -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.1 -Wall -Wextra -Wl,--no-add-needed \ -o /build/lua-cjson-2.1.0+dfsg/5.1-cjson/app-dynamic -I . -I /build/lua-cjson-2.1.0+dfsg/5.1-cjson/ \ /usr/share/dh-lua/test/5.1/app.c /build/lua-cjson-2.1.0+dfsg/5.1-cjson/liblua5.1-cjson.la \ -Wl,-z,relro -Wl,-z,now -llua5.1 libtool: link: aarch64-linux-gnu-gcc -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.1 -Wall -Wextra -Wl,--no-add-needed -o /build/lua-cjson-2.1.0+dfsg/5.1-cjson/.libs/app-dynamic -I . 
-I /build/lua-cjson-2.1.0+dfsg/5.1-cjson/ /usr/share/dh-lua/test/5.1/app.c -Wl,-z -Wl,relro -Wl,-z -Wl,now /build/lua-cjson-2.1.0+dfsg/5.1-cjson/.libs/liblua5.1-cjson.so -llua5.1 -Wl,-rpath -Wl,/usr//lib/aarch64-linux-gnu /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=execute -dlopen /build/lua-cjson-2.1.0+dfsg/5.1-cjson/liblua5.1-cjson.la \ ldd /build/lua-cjson-2.1.0+dfsg/5.1-cjson/app-dynamic linux-vdso.so.1 (0x0000ffff8ce20000) liblua5.1-cjson.so.0 => /build/lua-cjson-2.1.0+dfsg/5.1-cjson/.libs/liblua5.1-cjson.so.0 (0x0000ffff8cd90000) liblua5.1.so.0 => /usr//lib/aarch64-linux-gnu/liblua5.1.so.0 (0x0000ffff8cd40000) libc.so.6 => /usr//lib/aarch64-linux-gnu/libc.so.6 (0x0000ffff8cb90000) /lib/ld-linux-aarch64.so.1 (0x0000ffff8cde3000) libm.so.6 => /lib/aarch64-linux-gnu/libm.so.6 (0x0000ffff8caf0000) libdl.so.2 => /lib/aarch64-linux-gnu/libdl.so.2 (0x0000ffff8cac0000) ********************** app dynamic (5.1) ********* Test: cd tests/ && @@LUA@@ test.lua ==> Testing Lua CJSON version 2.1.0 ==> Test [1] Check module name, version: PASS [Input] { } [Received:success] { "cjson", "2.1.0" } ==> Test [2] Decode string: PASS [Input] { "\"test string\"" } [Received:success] { "test string" } ==> Test [3] Decode numbers: PASS [Input] { "[ 0.0, -5e3, -1, 0.3e-3, 1023.2, 0e10 ]" } [Received:success] { { 0, -5000, -1, 0.0003, 1023.2, 0 } } ==> Test [4] Decode null: PASS [Input] { "null" } [Received:success] { json.null } ==> Test [5] Decode true: PASS [Input] { "true" } [Received:success] { true } ==> Test [6] Decode false: PASS [Input] { "false" } [Received:success] { false } ==> Test [7] Decode object with numeric keys: PASS [Input] { "{ \"1\": \"one\", \"3\": \"three\" }" } [Received:success] { { ["1"] = "one", ["3"] = "three" } } ==> Test [8] Decode object with string keys: PASS [Input] { "{ \"a\": \"a\", \"b\": \"b\" }" } [Received:success] { { ["a"] = "a", ["b"] = "b" } } ==> Test [9] Decode array: PASS [Input] { "[ \"one\", null, \"three\" ]" } [Received:success] { { "one", json.null, "three" } } ==> Test [10] Decode UTF-16BE [throw error]: PASS [Input] { "\000\"\000\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [11] Decode UTF-16LE [throw error]: PASS [Input] { "\"\000\"\000" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [12] Decode UTF-32BE [throw error]: PASS [Input] { "\000\000\000\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [13] Decode UTF-32LE [throw error]: PASS [Input] { "\"\000\000\000" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [14] Decode partial JSON [throw error]: PASS [Input] { "{ \"unexpected eof\": " } [Received:error] { "Expected value but found T_END at character 21" } ==> Test [15] Decode with extra comma [throw error]: PASS [Input] { "{ \"extra data\": true }, false" } [Received:error] { "Expected the end but found T_COMMA at character 23" } ==> Test [16] Decode invalid escape code [throw error]: PASS [Input] { " { \"bad escape \\q code\" } " } [Received:error] { "Expected object key string but found invalid escape code at character 16" } ==> Test [17] Decode invalid unicode escape [throw error]: PASS [Input] { " { \"bad unicode \\u0f6 escape\" } " } [Received:error] { "Expected object key string but found invalid unicode escape code at character 17" } ==> Test [18] Decode invalid keyword [throw error]: PASS [Input] { " [ \"bad barewood\", test ] " } [Received:error] { "Expected value but found invalid 
token at character 20" } ==> Test [19] Decode invalid number #1 [throw error]: PASS [Input] { "[ -+12 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [20] Decode invalid number #2 [throw error]: PASS [Input] { "-v" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [21] Decode invalid number exponent [throw error]: PASS [Input] { "[ 0.4eg10 ]" } [Received:error] { "Expected comma or array end but found invalid token at character 6" } ==> Test [22] Set decode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [23] Decode array at nested limit: PASS [Input] { "[[[[[ \"nested\" ]]]]]" } [Received:success] { { { { { { "nested" } } } } } } ==> Test [24] Decode array over nested limit [throw error]: PASS [Input] { "[[[[[[ \"nested\" ]]]]]]" } [Received:error] { "Found too many nested data structures (6) at character 6" } ==> Test [25] Decode object at nested limit: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } [Received:success] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } ==> Test [26] Decode object over nested limit [throw error]: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":{\"f\":\"nested\"}}}}}}" } [Received:error] { "Found too many nested data structures (6) at character 26" } ==> Test [27] Set decode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [28] Decode deeply nested array [throw error]: PASS [Input] { "[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[1100]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]
]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]" } [Received:error] { "Found too many nested data structures (1001) at character 1001" } ==> Test [29] Set encode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [30] Encode nested table as array at nested limit: PASS [Input] { { { { { { "nested" } } } } } } [Received:success] { "[[[[[\"nested\"]]]]]" } ==> Test [31] Encode nested table as array after nested limit [throw error]: PASS [Input] { { { { { { { "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [32] Encode nested table as object at nested limit: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } [Received:success] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } ==> Test [33] Encode nested table as object over nested limit [throw error]: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = { ["f"] = "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [34] Encode table with cycle [throw error]: PASS [Input] { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [35] Set encode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [36] Encode deeply nested data [throw error]: PASS [Input] { { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, 
false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = Cannot serialise any further: too many nested tables, [2] = "string", ["a"] = Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (1001)" } ==> Test [37] Encode null: PASS [Input] { json.null } [Received:success] { "null" } ==> Test [38] Encode true: PASS [Input] { true } [Received:success] { "true" } ==> Test [39] Encode false: PASS [Input] { false } [Received:success] { "false" } ==> Test [40] Encode empty object: PASS [Input] { { } } [Received:success] { "{}" } ==> Test [41] Encode integer: PASS [Input] { 10 } [Received:success] { "10" } ==> Test [42] Encode string: PASS [Input] { "hello" } [Received:success] { "\"hello\"" } ==> Test [43] Encode Lua function [throw error]: PASS [Input] { "" } [Received:error] { "Cannot serialise function: type not supported" } ==> Test [44] Set decode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [45] Decode hexadecimal: PASS [Input] { "0x6.ffp1" } [Received:success] { 13.9921875 } ==> Test [46] Decode numbers with leading zero: PASS [Input] { "[ 0123, 00.33 ]" } [Received:success] { { 123, 0.33 } } ==> Test [47] Decode +-Inf: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:success] { { inf, inf, -inf } } ==> Test [48] Decode +-Infinity: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:success] { { inf, inf, -inf } } ==> Test [49] Decode +-NaN: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:success] { { nan, nan, -nan } } ==> Test [50] Decode Infrared (not infinity) [throw error]: PASS [Input] { "Infrared" } [Received:error] { "Expected the end but found invalid token at character 4" } ==> Test [51] Decode Noodle (not NaN) [throw error]: PASS [Input] { "Noodle" } [Received:error] { "Expected value but found invalid token at character 1" } ==> Test [52] Set decode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [53] Decode hexadecimal [throw error]: PASS [Input] { "0x6" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [54] Decode numbers with leading zero [throw error]: PASS [Input] { "[ 0123, 00.33 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> 
Test [55] Decode +-Inf [throw error]: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [56] Decode +-Infinity [throw error]: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [57] Decode +-NaN [throw error]: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [58] Set decode_invalid_numbers("on"): PASS [Input] { "on" } [Received:success] { true } ==> Test [59] Set encode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [60] Encode NaN [throw error]: PASS [Input] { nan } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [61] Encode Infinity [throw error]: PASS [Input] { inf } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [62] Set encode_invalid_numbers("null"): PASS [Input] { "null" } [Received:success] { "null" } ==> Test [63] Encode NaN as null: PASS [Input] { nan } [Received:success] { "null" } ==> Test [64] Encode Infinity as null: PASS [Input] { inf } [Received:success] { "null" } ==> Test [65] Set encode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [66] Encode NaN: PASS [Input] { nan } [Received:success] { "nan" } ==> Test [67] Encode Infinity: PASS [Input] { inf } [Received:success] { "inf" } ==> Test [68] Set encode_invalid_numbers("off"): PASS [Input] { "off" } [Received:success] { false } ==> Test [69] Set encode_sparse_array(true, 2, 3): PASS [Input] { true, 2, 3 } [Received:success] { true, 2, 3 } ==> Test [70] Encode sparse table as array #1: PASS [Input] { { [3] = "sparse test" } } [Received:success] { "[null,null,\"sparse test\"]" } ==> Test [71] Encode sparse table as array #2: PASS [Input] { { "one", nil, nil, "sparse test" } } [Received:success] { "[\"one\",null,null,\"sparse test\"]" } ==> Test [72] Encode table with numeric string key as object: PASS [Input] { { ["2"] = "numeric string key test" } } [Received:success] { "{\"2\":\"numeric string key test\"}" } ==> Test [73] Set encode_sparse_array(false): PASS [Input] { false } [Received:success] { false, 2, 3 } ==> Test [74] Encode table with incompatible key [throw error]: PASS [Input] { { [false] = "wrong" } } [Received:error] { "Cannot serialise boolean: table key must be a number or string" } ==> Test [75] Encode all octets (8-bit clean): PASS [Input] { "\000 \ \r !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~" } [Received:success] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } ==> Test [76] Decode all escaped octets: PASS [Input] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } [Received:success] { "\000 \ \r !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~" } ==> Test [77] Decode single UTF-16 escape: PASS 
[Input] { "\"\\uF800\"" } [Received:success] { "" } ==> Test [78] Decode swapped surrogate pair [throw error]: PASS [Input] { "\"\\uDC00\\uD800\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [79] Decode duplicate high surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [80] Decode duplicate low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [81] Decode missing low surrogate [throw error]: PASS [Input] { "\"\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [82] Decode invalid low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uD\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Set locale to cs_CZ (comma separator) ==> Test [83] Encode number under comma locale: PASS [Input] { 1,5 } [Received:success] { "1.5" } ==> Test [84] Decode number in array under comma locale: PASS [Input] { "[ 10, \"test\" ]" } [Received:success] { { 10, "test" } } ==> Revert locale to POSIX ==> Test [85] Set encode_keep_buffer(false): PASS [Input] { false } [Received:success] { false } ==> Test [86] Set encode_number_precision(3): PASS [Input] { 3 } [Received:success] { 3 } ==> Test [87] Encode number with precision 3: PASS [Input] { 0.33333333333333 } [Received:success] { "0.333" } ==> Test [88] Set encode_number_precision(14): PASS [Input] { 14 } [Received:success] { 14 } ==> Test [89] Set encode_keep_buffer(true): PASS [Input] { true } [Received:success] { true } ==> Test [90] Set encode_number_precision(0) [throw error]: PASS [Input] { 0 } [Received:error] { "bad argument #1 to '?' (expected integer between 1 and 14)" } ==> Test [91] Set encode_number_precision("five") [throw error]: PASS [Input] { "five" } [Received:error] { "bad argument #1 to '?' (number expected, got string)" } ==> Test [92] Set encode_keep_buffer(nil, true) [throw error]: PASS [Input] { nil, true } [Received:error] { "bad argument #2 to '?' (found too many arguments)" } ==> Test [93] Set encode_max_depth("wrong") [throw error]: PASS [Input] { "wrong" } [Received:error] { "bad argument #1 to '?' (number expected, got string)" } ==> Test [94] Set decode_max_depth(0) [throw error]: PASS [Input] { "0" } [Received:error] { "bad argument #1 to '?' (expected integer between 1 and 2147483647)" } ==> Test [95] Set encode_invalid_numbers(-2) [throw error]: PASS [Input] { -2 } [Received:error] { "bad argument #1 to '?' (invalid option '-2')" } ==> Test [96] Set decode_invalid_numbers(true, false) [throw error]: PASS [Input] { true, false } [Received:error] { "bad argument #2 to '?' (found too many arguments)" } ==> Test [97] Set encode_sparse_array("not quite on") [throw error]: PASS [Input] { "not quite on" } [Received:error] { "bad argument #1 to '?' (invalid option 'not quite on')" } ==> Reset Lua CJSON configuration ==> Test [98] Check encode_sparse_array(): PASS [Input] { } [Received:success] { false, 2, 10 } ==> Test [99] Encode (safe) simple value: PASS [Input] { true } [Received:success] { "true" } ==> Test [100] Encode (safe) argument validation [throw error]: PASS [Input] { "arg1", "arg2" } [Received:error] { "bad argument #1 to '?' 
(expected 1 argument)" } ==> Test [101] Decode (safe) error generation: PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Test [102] Decode (safe) error generation after new(): PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Summary: all tests succeeded ************************************************** /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=link aarch64-linux-gnu-gcc -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.1 -Wall -Wextra -Wl,--no-add-needed \ -static -o /build/lua-cjson-2.1.0+dfsg/5.1-cjson/app-static -I . -I /build/lua-cjson-2.1.0+dfsg/5.1-cjson/ \ /usr/share/dh-lua/test/5.1/app.c /build/lua-cjson-2.1.0+dfsg/5.1-cjson/liblua5.1-cjson.la \ -Wl,-z,relro -Wl,-z,now -llua5.1 -lm -ldl libtool: link: aarch64-linux-gnu-gcc -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.1 -Wall -Wextra -Wl,--no-add-needed -o /build/lua-cjson-2.1.0+dfsg/5.1-cjson/app-static -I . -I /build/lua-cjson-2.1.0+dfsg/5.1-cjson/ /usr/share/dh-lua/test/5.1/app.c -Wl,-z -Wl,relro -Wl,-z -Wl,now /build/lua-cjson-2.1.0+dfsg/5.1-cjson/.libs/liblua5.1-cjson.a -llua5.1 -lm -ldl ldd /build/lua-cjson-2.1.0+dfsg/5.1-cjson/app-static linux-vdso.so.1 (0x0000ffff9a983000) liblua5.1.so.0 => /lib/aarch64-linux-gnu/liblua5.1.so.0 (0x0000ffff9a8d0000) libc.so.6 => /lib/aarch64-linux-gnu/libc.so.6 (0x0000ffff9a720000) /lib/ld-linux-aarch64.so.1 (0x0000ffff9a946000) libm.so.6 => /lib/aarch64-linux-gnu/libm.so.6 (0x0000ffff9a680000) libdl.so.2 => /lib/aarch64-linux-gnu/libdl.so.2 (0x0000ffff9a650000) *********************** app static (5.1) ********* Test: cd tests/ && @@LUA@@ test.lua ==> Testing Lua CJSON version 2.1.0 ==> Test [1] Check module name, version: PASS [Input] { } [Received:success] { "cjson", "2.1.0" } ==> Test [2] Decode string: PASS [Input] { "\"test string\"" } [Received:success] { "test string" } ==> Test [3] Decode numbers: PASS [Input] { "[ 0.0, -5e3, -1, 0.3e-3, 1023.2, 0e10 ]" } [Received:success] { { 0, -5000, -1, 0.0003, 1023.2, 0 } } ==> Test [4] Decode null: PASS [Input] { "null" } [Received:success] { json.null } ==> Test [5] Decode true: PASS [Input] { "true" } [Received:success] { true } ==> Test [6] Decode false: PASS [Input] { "false" } [Received:success] { false } ==> Test [7] Decode object with numeric keys: PASS [Input] { "{ \"1\": \"one\", \"3\": \"three\" }" } [Received:success] { { ["1"] = "one", ["3"] = "three" } } ==> Test [8] Decode object with string keys: PASS [Input] { "{ \"a\": \"a\", \"b\": \"b\" }" } [Received:success] { { ["a"] = "a", ["b"] = "b" } } ==> Test [9] Decode array: PASS [Input] { "[ \"one\", null, \"three\" ]" } [Received:success] { { "one", json.null, "three" } } ==> Test [10] Decode UTF-16BE [throw error]: PASS [Input] { "\000\"\000\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [11] Decode UTF-16LE [throw error]: PASS [Input] { "\"\000\"\000" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [12] Decode UTF-32BE [throw error]: PASS [Input] { "\000\000\000\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [13] Decode UTF-32LE [throw error]: PASS [Input] { "\"\000\000\000" } 
[Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [14] Decode partial JSON [throw error]: PASS [Input] { "{ \"unexpected eof\": " } [Received:error] { "Expected value but found T_END at character 21" } ==> Test [15] Decode with extra comma [throw error]: PASS [Input] { "{ \"extra data\": true }, false" } [Received:error] { "Expected the end but found T_COMMA at character 23" } ==> Test [16] Decode invalid escape code [throw error]: PASS [Input] { " { \"bad escape \\q code\" } " } [Received:error] { "Expected object key string but found invalid escape code at character 16" } ==> Test [17] Decode invalid unicode escape [throw error]: PASS [Input] { " { \"bad unicode \\u0f6 escape\" } " } [Received:error] { "Expected object key string but found invalid unicode escape code at character 17" } ==> Test [18] Decode invalid keyword [throw error]: PASS [Input] { " [ \"bad barewood\", test ] " } [Received:error] { "Expected value but found invalid token at character 20" } ==> Test [19] Decode invalid number #1 [throw error]: PASS [Input] { "[ -+12 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [20] Decode invalid number #2 [throw error]: PASS [Input] { "-v" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [21] Decode invalid number exponent [throw error]: PASS [Input] { "[ 0.4eg10 ]" } [Received:error] { "Expected comma or array end but found invalid token at character 6" } ==> Test [22] Set decode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [23] Decode array at nested limit: PASS [Input] { "[[[[[ \"nested\" ]]]]]" } [Received:success] { { { { { { "nested" } } } } } } ==> Test [24] Decode array over nested limit [throw error]: PASS [Input] { "[[[[[[ \"nested\" ]]]]]]" } [Received:error] { "Found too many nested data structures (6) at character 6" } ==> Test [25] Decode object at nested limit: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } [Received:success] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } ==> Test [26] Decode object over nested limit [throw error]: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":{\"f\":\"nested\"}}}}}}" } [Received:error] { "Found too many nested data structures (6) at character 26" } ==> Test [27] Set decode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [28] Decode deeply nested array [throw error]: PASS [Input] { 
"[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[1100]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]" } [Received:error] { "Found too many nested data structures (1001) at character 1001" } ==> Test [29] Set encode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [30] Encode nested table as array at nested limit: PASS [Input] { { { { { { "nested" } } } } } } [Received:success] { "[[[[[\"nested\"]]]]]" } ==> Test [31] Encode nested table as array after nested limit [throw error]: PASS [Input] { { { { { { { "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [32] Encode nested table as object at nested limit: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } [Received:success] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } ==> Test [33] Encode nested table as object over nested limit [throw error]: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = { ["f"] = "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [34] Encode table with cycle [throw error]: PASS [Input] { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive 
nesting (6)" } ==> Test [35] Set encode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [36] Encode deeply nested data [throw error]: PASS [Input] { { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = Cannot serialise any further: too many nested tables, [2] = "string", ["a"] = Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (1001)" } 
==> Test [37] Encode null: PASS [Input] { json.null } [Received:success] { "null" } ==> Test [38] Encode true: PASS [Input] { true } [Received:success] { "true" } ==> Test [39] Encode false: PASS [Input] { false } [Received:success] { "false" } ==> Test [40] Encode empty object: PASS [Input] { { } } [Received:success] { "{}" } ==> Test [41] Encode integer: PASS [Input] { 10 } [Received:success] { "10" } ==> Test [42] Encode string: PASS [Input] { "hello" } [Received:success] { "\"hello\"" } ==> Test [43] Encode Lua function [throw error]: PASS [Input] { "" } [Received:error] { "Cannot serialise function: type not supported" } ==> Test [44] Set decode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [45] Decode hexadecimal: PASS [Input] { "0x6.ffp1" } [Received:success] { 13.9921875 } ==> Test [46] Decode numbers with leading zero: PASS [Input] { "[ 0123, 00.33 ]" } [Received:success] { { 123, 0.33 } } ==> Test [47] Decode +-Inf: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:success] { { inf, inf, -inf } } ==> Test [48] Decode +-Infinity: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:success] { { inf, inf, -inf } } ==> Test [49] Decode +-NaN: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:success] { { nan, nan, -nan } } ==> Test [50] Decode Infrared (not infinity) [throw error]: PASS [Input] { "Infrared" } [Received:error] { "Expected the end but found invalid token at character 4" } ==> Test [51] Decode Noodle (not NaN) [throw error]: PASS [Input] { "Noodle" } [Received:error] { "Expected value but found invalid token at character 1" } ==> Test [52] Set decode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [53] Decode hexadecimal [throw error]: PASS [Input] { "0x6" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [54] Decode numbers with leading zero [throw error]: PASS [Input] { "[ 0123, 00.33 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [55] Decode +-Inf [throw error]: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [56] Decode +-Infinity [throw error]: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [57] Decode +-NaN [throw error]: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [58] Set decode_invalid_numbers("on"): PASS [Input] { "on" } [Received:success] { true } ==> Test [59] Set encode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [60] Encode NaN [throw error]: PASS [Input] { nan } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [61] Encode Infinity [throw error]: PASS [Input] { inf } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [62] Set encode_invalid_numbers("null"): PASS [Input] { "null" } [Received:success] { "null" } ==> Test [63] Encode NaN as null: PASS [Input] { nan } [Received:success] { "null" } ==> Test [64] Encode Infinity as null: PASS [Input] { inf } [Received:success] { "null" } ==> Test [65] Set encode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [66] Encode NaN: PASS [Input] { nan } [Received:success] { "nan" } ==> Test [67] Encode Infinity: PASS [Input] { inf } [Received:success] { "inf" } ==> Test [68] 
Set encode_invalid_numbers("off"): PASS [Input] { "off" } [Received:success] { false } ==> Test [69] Set encode_sparse_array(true, 2, 3): PASS [Input] { true, 2, 3 } [Received:success] { true, 2, 3 } ==> Test [70] Encode sparse table as array #1: PASS [Input] { { [3] = "sparse test" } } [Received:success] { "[null,null,\"sparse test\"]" } ==> Test [71] Encode sparse table as array #2: PASS [Input] { { "one", nil, nil, "sparse test" } } [Received:success] { "[\"one\",null,null,\"sparse test\"]" } ==> Test [72] Encode table with numeric string key as object: PASS [Input] { { ["2"] = "numeric string key test" } } [Received:success] { "{\"2\":\"numeric string key test\"}" } ==> Test [73] Set encode_sparse_array(false): PASS [Input] { false } [Received:success] { false, 2, 3 } ==> Test [74] Encode table with incompatible key [throw error]: PASS [Input] { { [false] = "wrong" } } [Received:error] { "Cannot serialise boolean: table key must be a number or string" } ==> Test [75] Encode all octets (8-bit clean): PASS [Input] { "\000 \ \r !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~" } [Received:success] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } ==> Test [76] Decode all escaped octets: PASS [Input] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } [Received:success] { "\000 \ \r !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~" } ==> Test [77] Decode single UTF-16 escape: PASS [Input] { "\"\\uF800\"" } [Received:success] { "" } ==> Test [78] Decode swapped surrogate pair [throw error]: PASS [Input] { "\"\\uDC00\\uD800\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [79] Decode duplicate high surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [80] Decode duplicate low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [81] Decode missing low surrogate [throw error]: PASS [Input] { "\"\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [82] Decode invalid low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uD\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Set locale to cs_CZ (comma separator) ==> Test [83] Encode number under comma locale: PASS [Input] { 1,5 } [Received:success] { "1.5" } ==> Test [84] Decode number in array under comma locale: PASS [Input] { "[ 10, \"test\" ]" } [Received:success] { { 10, "test" } } ==> Revert locale to POSIX ==> Test [85] Set encode_keep_buffer(false): PASS [Input] { false } [Received:success] { false } ==> Test [86] Set encode_number_precision(3): PASS [Input] { 3 } [Received:success] { 3 } ==> Test [87] Encode number with 
precision 3: PASS [Input] { 0.33333333333333 } [Received:success] { "0.333" } ==> Test [88] Set encode_number_precision(14): PASS [Input] { 14 } [Received:success] { 14 } ==> Test [89] Set encode_keep_buffer(true): PASS [Input] { true } [Received:success] { true } ==> Test [90] Set encode_number_precision(0) [throw error]: PASS [Input] { 0 } [Received:error] { "bad argument #1 to '?' (expected integer between 1 and 14)" } ==> Test [91] Set encode_number_precision("five") [throw error]: PASS [Input] { "five" } [Received:error] { "bad argument #1 to '?' (number expected, got string)" } ==> Test [92] Set encode_keep_buffer(nil, true) [throw error]: PASS [Input] { nil, true } [Received:error] { "bad argument #2 to '?' (found too many arguments)" } ==> Test [93] Set encode_max_depth("wrong") [throw error]: PASS [Input] { "wrong" } [Received:error] { "bad argument #1 to '?' (number expected, got string)" } ==> Test [94] Set decode_max_depth(0) [throw error]: PASS [Input] { "0" } [Received:error] { "bad argument #1 to '?' (expected integer between 1 and 2147483647)" } ==> Test [95] Set encode_invalid_numbers(-2) [throw error]: PASS [Input] { -2 } [Received:error] { "bad argument #1 to '?' (invalid option '-2')" } ==> Test [96] Set decode_invalid_numbers(true, false) [throw error]: PASS [Input] { true, false } [Received:error] { "bad argument #2 to '?' (found too many arguments)" } ==> Test [97] Set encode_sparse_array("not quite on") [throw error]: PASS [Input] { "not quite on" } [Received:error] { "bad argument #1 to '?' (invalid option 'not quite on')" } ==> Reset Lua CJSON configuration ==> Test [98] Check encode_sparse_array(): PASS [Input] { } [Received:success] { false, 2, 10 } ==> Test [99] Encode (safe) simple value: PASS [Input] { true } [Received:success] { "true" } ==> Test [100] Encode (safe) argument validation [throw error]: PASS [Input] { "arg1", "arg2" } [Received:error] { "bad argument #1 to '?' 
(expected 1 argument)" } ==> Test [101] Decode (safe) error generation: PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Test [102] Decode (safe) error generation after new(): PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Summary: all tests succeeded ************************************************** Target test made Making target test for debian/lua5.2.dh-lua.conf # tests Copying lua/cjson/util.lua in /build/lua-cjson-2.1.0+dfsg/5.2-cjson for test ********************** lua dynamic (5.2) ********* Test: cd tests/ && @@LUA@@ test.lua ==> Testing Lua CJSON version 2.1.0 ==> Test [1] Check module name, version: PASS [Input] { } [Received:success] { "cjson", "2.1.0" } ==> Test [2] Decode string: PASS [Input] { "\"test string\"" } [Received:success] { "test string" } ==> Test [3] Decode numbers: PASS [Input] { "[ 0.0, -5e3, -1, 0.3e-3, 1023.2, 0e10 ]" } [Received:success] { { 0, -5000, -1, 0.0003, 1023.2, 0 } } ==> Test [4] Decode null: PASS [Input] { "null" } [Received:success] { json.null } ==> Test [5] Decode true: PASS [Input] { "true" } [Received:success] { true } ==> Test [6] Decode false: PASS [Input] { "false" } [Received:success] { false } ==> Test [7] Decode object with numeric keys: PASS [Input] { "{ \"1\": \"one\", \"3\": \"three\" }" } [Received:success] { { ["1"] = "one", ["3"] = "three" } } ==> Test [8] Decode object with string keys: PASS [Input] { "{ \"a\": \"a\", \"b\": \"b\" }" } [Received:success] { { ["a"] = "a", ["b"] = "b" } } ==> Test [9] Decode array: PASS [Input] { "[ \"one\", null, \"three\" ]" } [Received:success] { { "one", json.null, "three" } } ==> Test [10] Decode UTF-16BE [throw error]: PASS [Input] { "\0\"\0\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [11] Decode UTF-16LE [throw error]: PASS [Input] { "\"\0\"\0" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [12] Decode UTF-32BE [throw error]: PASS [Input] { "\0\0\0\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [13] Decode UTF-32LE [throw error]: PASS [Input] { "\"\0\0\0" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [14] Decode partial JSON [throw error]: PASS [Input] { "{ \"unexpected eof\": " } [Received:error] { "Expected value but found T_END at character 21" } ==> Test [15] Decode with extra comma [throw error]: PASS [Input] { "{ \"extra data\": true }, false" } [Received:error] { "Expected the end but found T_COMMA at character 23" } ==> Test [16] Decode invalid escape code [throw error]: PASS [Input] { " { \"bad escape \\q code\" } " } [Received:error] { "Expected object key string but found invalid escape code at character 16" } ==> Test [17] Decode invalid unicode escape [throw error]: PASS [Input] { " { \"bad unicode \\u0f6 escape\" } " } [Received:error] { "Expected object key string but found invalid unicode escape code at character 17" } ==> Test [18] Decode invalid keyword [throw error]: PASS [Input] { " [ \"bad barewood\", test ] " } [Received:error] { "Expected value but found invalid token at character 20" } ==> Test [19] Decode invalid number #1 [throw error]: PASS [Input] { "[ -+12 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [20] Decode invalid number #2 [throw error]: PASS [Input] { "-v" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test 
[21] Decode invalid number exponent [throw error]: PASS [Input] { "[ 0.4eg10 ]" } [Received:error] { "Expected comma or array end but found invalid token at character 6" } ==> Test [22] Set decode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [23] Decode array at nested limit: PASS [Input] { "[[[[[ \"nested\" ]]]]]" } [Received:success] { { { { { { "nested" } } } } } } ==> Test [24] Decode array over nested limit [throw error]: PASS [Input] { "[[[[[[ \"nested\" ]]]]]]" } [Received:error] { "Found too many nested data structures (6) at character 6" } ==> Test [25] Decode object at nested limit: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } [Received:success] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } ==> Test [26] Decode object over nested limit [throw error]: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":{\"f\":\"nested\"}}}}}}" } [Received:error] { "Found too many nested data structures (6) at character 26" } ==> Test [27] Set decode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [28] Decode deeply nested array [throw error]: PASS [Input] { "[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[1100]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]" } [Received:error] { "Found too many nested data structures (1001) at character 1001" } ==> Test [29] Set encode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test 
[30] Encode nested table as array at nested limit: PASS [Input] { { { { { { "nested" } } } } } } [Received:success] { "[[[[[\"nested\"]]]]]" } ==> Test [31] Encode nested table as array after nested limit [throw error]: PASS [Input] { { { { { { { "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [32] Encode nested table as object at nested limit: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } [Received:success] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } ==> Test [33] Encode nested table as object over nested limit [throw error]: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = { ["f"] = "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [34] Encode table with cycle [throw error]: PASS [Input] { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [35] Set encode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [36] Encode deeply nested data [throw error]: PASS [Input] { { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, 
[2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = Cannot serialise any further: too many nested tables, [2] = "string", ["a"] = Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (1001)" } ==> Test [37] Encode null: PASS [Input] { json.null } [Received:success] { "null" } ==> Test [38] Encode true: PASS [Input] { true } [Received:success] { "true" } ==> Test [39] Encode false: PASS [Input] { false } [Received:success] { "false" } ==> Test [40] Encode empty object: PASS [Input] { { } } [Received:success] { "{}" } ==> Test [41] Encode integer: PASS [Input] { 10 } [Received:success] { "10" } ==> Test [42] Encode string: PASS [Input] { "hello" } [Received:success] { "\"hello\"" } ==> Test [43] Encode Lua function [throw error]: PASS [Input] { "" } [Received:error] { "Cannot serialise function: type not supported" } ==> Test [44] Set decode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [45] Decode hexadecimal: PASS [Input] { "0x6.ffp1" } [Received:success] { 13.9921875 } ==> Test [46] Decode numbers with leading zero: PASS [Input] { "[ 0123, 00.33 ]" } [Received:success] { { 123, 0.33 } } ==> Test [47] Decode +-Inf: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:success] { { inf, inf, -inf } } ==> Test [48] Decode +-Infinity: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:success] { { inf, inf, -inf } } ==> Test [49] Decode +-NaN: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:success] { { nan, nan, -nan } } ==> Test [50] Decode Infrared (not infinity) [throw error]: PASS [Input] { "Infrared" } [Received:error] { "Expected the end but found invalid token at character 4" } ==> Test [51] Decode Noodle (not NaN) [throw error]: PASS [Input] { "Noodle" } [Received:error] { "Expected value but found invalid token at character 1" } ==> Test [52] Set decode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [53] Decode hexadecimal [throw error]: PASS [Input] { "0x6" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [54] Decode numbers with leading zero [throw error]: PASS [Input] { "[ 0123, 00.33 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [55] Decode +-Inf [throw error]: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [56] Decode +-Infinity [throw error]: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [57] 
Decode +-NaN [throw error]: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [58] Set decode_invalid_numbers("on"): PASS [Input] { "on" } [Received:success] { true } ==> Test [59] Set encode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [60] Encode NaN [throw error]: PASS [Input] { nan } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [61] Encode Infinity [throw error]: PASS [Input] { inf } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [62] Set encode_invalid_numbers("null"): PASS [Input] { "null" } [Received:success] { "null" } ==> Test [63] Encode NaN as null: PASS [Input] { nan } [Received:success] { "null" } ==> Test [64] Encode Infinity as null: PASS [Input] { inf } [Received:success] { "null" } ==> Test [65] Set encode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [66] Encode NaN: PASS [Input] { nan } [Received:success] { "nan" } ==> Test [67] Encode Infinity: PASS [Input] { inf } [Received:success] { "inf" } ==> Test [68] Set encode_invalid_numbers("off"): PASS [Input] { "off" } [Received:success] { false } ==> Test [69] Set encode_sparse_array(true, 2, 3): PASS [Input] { true, 2, 3 } [Received:success] { true, 2, 3 } ==> Test [70] Encode sparse table as array #1: PASS [Input] { { [3] = "sparse test" } } [Received:success] { "[null,null,\"sparse test\"]" } ==> Test [71] Encode sparse table as array #2: PASS [Input] { { "one", nil, nil, "sparse test" } } [Received:success] { "[\"one\",null,null,\"sparse test\"]" } ==> Test [72] Encode table with numeric string key as object: PASS [Input] { { ["2"] = "numeric string key test" } } [Received:success] { "{\"2\":\"numeric string key test\"}" } ==> Test [73] Set encode_sparse_array(false): PASS [Input] { false } [Received:success] { false, 2, 3 } ==> Test [74] Encode table with incompatible key [throw error]: PASS [Input] { { [false] = "wrong" } } [Received:error] { "Cannot serialise boolean: table key must be a number or string" } ==> Test [75] Encode all octets (8-bit clean): PASS [Input] { "\0\1\2\3\4\5\6\7\8\9\ \11\12\13\14\15\16\17\18\19\20\21\22\23\24\25\26\27\28\29\30\31 !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\127" } [Received:success] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } ==> Test [76] Decode all escaped octets: PASS [Input] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } [Received:success] { "\0\1\2\3\4\5\6\7\8\9\ \11\12\13\14\15\16\17\18\19\20\21\22\23\24\25\26\27\28\29\30\31 !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\127" } ==> Test [77] Decode single UTF-16 escape: PASS [Input] { "\"\\uF800\"" } [Received:success] { "" } ==> Test [78] Decode swapped surrogate pair [throw error]: PASS [Input] { "\"\\uDC00\\uD800\"" } [Received:error] { "Expected value 
but found invalid unicode escape code at character 2" } ==> Test [79] Decode duplicate high surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [80] Decode duplicate low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [81] Decode missing low surrogate [throw error]: PASS [Input] { "\"\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [82] Decode invalid low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uD\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Set locale to cs_CZ (comma separator) ==> Test [83] Encode number under comma locale: PASS [Input] { 1,5 } [Received:success] { "1.5" } ==> Test [84] Decode number in array under comma locale: PASS [Input] { "[ 10, \"test\" ]" } [Received:success] { { 10, "test" } } ==> Revert locale to POSIX ==> Test [85] Set encode_keep_buffer(false): PASS [Input] { false } [Received:success] { false } ==> Test [86] Set encode_number_precision(3): PASS [Input] { 3 } [Received:success] { 3 } ==> Test [87] Encode number with precision 3: PASS [Input] { 0.33333333333333 } [Received:success] { "0.333" } ==> Test [88] Set encode_number_precision(14): PASS [Input] { 14 } [Received:success] { 14 } ==> Test [89] Set encode_keep_buffer(true): PASS [Input] { true } [Received:success] { true } ==> Test [90] Set encode_number_precision(0) [throw error]: PASS [Input] { 0 } [Received:error] { "bad argument #1 to 'cjson.encode_number_precision' (expected integer between 1 and 14)" } ==> Test [91] Set encode_number_precision("five") [throw error]: PASS [Input] { "five" } [Received:error] { "bad argument #1 to 'cjson.encode_number_precision' (number expected, got string)" } ==> Test [92] Set encode_keep_buffer(nil, true) [throw error]: PASS [Input] { nil, true } [Received:error] { "bad argument #2 to 'cjson.encode_keep_buffer' (found too many arguments)" } ==> Test [93] Set encode_max_depth("wrong") [throw error]: PASS [Input] { "wrong" } [Received:error] { "bad argument #1 to 'cjson.encode_max_depth' (number expected, got string)" } ==> Test [94] Set decode_max_depth(0) [throw error]: PASS [Input] { "0" } [Received:error] { "bad argument #1 to 'cjson.decode_max_depth' (expected integer between 1 and 2147483647)" } ==> Test [95] Set encode_invalid_numbers(-2) [throw error]: PASS [Input] { -2 } [Received:error] { "bad argument #1 to 'cjson.encode_invalid_numbers' (invalid option '-2')" } ==> Test [96] Set decode_invalid_numbers(true, false) [throw error]: PASS [Input] { true, false } [Received:error] { "bad argument #2 to 'cjson.decode_invalid_numbers' (found too many arguments)" } ==> Test [97] Set encode_sparse_array("not quite on") [throw error]: PASS [Input] { "not quite on" } [Received:error] { "bad argument #1 to 'cjson.encode_sparse_array' (invalid option 'not quite on')" } ==> Reset Lua CJSON configuration ==> Test [98] Check encode_sparse_array(): PASS [Input] { } [Received:success] { false, 2, 10 } ==> Test [99] Encode (safe) simple value: PASS [Input] { true } [Received:success] { "true" } ==> Test [100] Encode (safe) argument validation [throw error]: PASS [Input] { "arg1", "arg2" } [Received:error] { "bad argument #1 to '?' 
(expected 1 argument)" } ==> Test [101] Decode (safe) error generation: PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Test [102] Decode (safe) error generation after new(): PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Summary: all tests succeeded ************************************************** /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=link aarch64-linux-gnu-gcc -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.2 -Wall -Wextra -Wl,--no-add-needed \ -o /build/lua-cjson-2.1.0+dfsg/5.2-cjson/app-dynamic -I . -I /build/lua-cjson-2.1.0+dfsg/5.2-cjson/ \ /usr/share/dh-lua/test/5.2/app.c /build/lua-cjson-2.1.0+dfsg/5.2-cjson/liblua5.2-cjson.la \ -Wl,-z,relro -Wl,-z,now -llua5.2 libtool: link: aarch64-linux-gnu-gcc -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.2 -Wall -Wextra -Wl,--no-add-needed -o /build/lua-cjson-2.1.0+dfsg/5.2-cjson/.libs/app-dynamic -I . -I /build/lua-cjson-2.1.0+dfsg/5.2-cjson/ /usr/share/dh-lua/test/5.2/app.c -Wl,-z -Wl,relro -Wl,-z -Wl,now /build/lua-cjson-2.1.0+dfsg/5.2-cjson/.libs/liblua5.2-cjson.so -llua5.2 -Wl,-rpath -Wl,/usr//lib/aarch64-linux-gnu /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=execute -dlopen /build/lua-cjson-2.1.0+dfsg/5.2-cjson/liblua5.2-cjson.la \ ldd /build/lua-cjson-2.1.0+dfsg/5.2-cjson/app-dynamic linux-vdso.so.1 (0x0000ffff909c6000) liblua5.2-cjson.so.0 => /build/lua-cjson-2.1.0+dfsg/5.2-cjson/.libs/liblua5.2-cjson.so.0 (0x0000ffff90930000) liblua5.2.so.0 => /usr//lib/aarch64-linux-gnu/liblua5.2.so.0 (0x0000ffff908d0000) libc.so.6 => /usr//lib/aarch64-linux-gnu/libc.so.6 (0x0000ffff90720000) /lib/ld-linux-aarch64.so.1 (0x0000ffff90989000) libm.so.6 => /lib/aarch64-linux-gnu/libm.so.6 (0x0000ffff90680000) ********************** app dynamic (5.2) ********* Test: cd tests/ && @@LUA@@ test.lua ==> Testing Lua CJSON version 2.1.0 ==> Test [1] Check module name, version: PASS [Input] { } [Received:success] { "cjson", "2.1.0" } ==> Test [2] Decode string: PASS [Input] { "\"test string\"" } [Received:success] { "test string" } ==> Test [3] Decode numbers: PASS [Input] { "[ 0.0, -5e3, -1, 0.3e-3, 1023.2, 0e10 ]" } [Received:success] { { 0, -5000, -1, 0.0003, 1023.2, 0 } } ==> Test [4] Decode null: PASS [Input] { "null" } [Received:success] { json.null } ==> Test [5] Decode true: PASS [Input] { "true" } [Received:success] { true } ==> Test [6] Decode false: PASS [Input] { "false" } [Received:success] { false } ==> Test [7] Decode object with numeric keys: PASS [Input] { "{ \"1\": \"one\", \"3\": \"three\" }" } [Received:success] { { ["1"] = "one", ["3"] = "three" } } ==> Test [8] Decode object with string keys: PASS [Input] { "{ \"a\": \"a\", \"b\": \"b\" }" } [Received:success] { { ["a"] = "a", ["b"] = "b" } } ==> Test [9] Decode array: PASS [Input] { "[ \"one\", null, \"three\" ]" } [Received:success] { { "one", json.null, "three" } } ==> Test [10] Decode UTF-16BE [throw error]: PASS [Input] { "\0\"\0\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [11] Decode UTF-16LE [throw error]: PASS [Input] { "\"\0\"\0" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [12] 
Decode UTF-32BE [throw error]: PASS [Input] { "\0\0\0\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [13] Decode UTF-32LE [throw error]: PASS [Input] { "\"\0\0\0" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [14] Decode partial JSON [throw error]: PASS [Input] { "{ \"unexpected eof\": " } [Received:error] { "Expected value but found T_END at character 21" } ==> Test [15] Decode with extra comma [throw error]: PASS [Input] { "{ \"extra data\": true }, false" } [Received:error] { "Expected the end but found T_COMMA at character 23" } ==> Test [16] Decode invalid escape code [throw error]: PASS [Input] { " { \"bad escape \\q code\" } " } [Received:error] { "Expected object key string but found invalid escape code at character 16" } ==> Test [17] Decode invalid unicode escape [throw error]: PASS [Input] { " { \"bad unicode \\u0f6 escape\" } " } [Received:error] { "Expected object key string but found invalid unicode escape code at character 17" } ==> Test [18] Decode invalid keyword [throw error]: PASS [Input] { " [ \"bad barewood\", test ] " } [Received:error] { "Expected value but found invalid token at character 20" } ==> Test [19] Decode invalid number #1 [throw error]: PASS [Input] { "[ -+12 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [20] Decode invalid number #2 [throw error]: PASS [Input] { "-v" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [21] Decode invalid number exponent [throw error]: PASS [Input] { "[ 0.4eg10 ]" } [Received:error] { "Expected comma or array end but found invalid token at character 6" } ==> Test [22] Set decode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [23] Decode array at nested limit: PASS [Input] { "[[[[[ \"nested\" ]]]]]" } [Received:success] { { { { { { "nested" } } } } } } ==> Test [24] Decode array over nested limit [throw error]: PASS [Input] { "[[[[[[ \"nested\" ]]]]]]" } [Received:error] { "Found too many nested data structures (6) at character 6" } ==> Test [25] Decode object at nested limit: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } [Received:success] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } ==> Test [26] Decode object over nested limit [throw error]: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":{\"f\":\"nested\"}}}}}}" } [Received:error] { "Found too many nested data structures (6) at character 26" } ==> Test [27] Set decode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [28] Decode deeply nested array [throw error]: PASS [Input] { 
"[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[1100]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]" } [Received:error] { "Found too many nested data structures (1001) at character 1001" } ==> Test [29] Set encode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [30] Encode nested table as array at nested limit: PASS [Input] { { { { { { "nested" } } } } } } [Received:success] { "[[[[[\"nested\"]]]]]" } ==> Test [31] Encode nested table as array after nested limit [throw error]: PASS [Input] { { { { { { { "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [32] Encode nested table as object at nested limit: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } [Received:success] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } ==> Test [33] Encode nested table as object over nested limit [throw error]: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = { ["f"] = "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [34] Encode table with cycle [throw error]: PASS [Input] { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive 
nesting (6)" } ==> Test [35] Set encode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [36] Encode deeply nested data [throw error]: PASS [Input] { { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = Cannot serialise any further: too many nested tables, [2] = "string", ["a"] = Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (1001)" } 
==> Test [37] Encode null: PASS [Input] { json.null } [Received:success] { "null" } ==> Test [38] Encode true: PASS [Input] { true } [Received:success] { "true" } ==> Test [39] Encode false: PASS [Input] { false } [Received:success] { "false" } ==> Test [40] Encode empty object: PASS [Input] { { } } [Received:success] { "{}" } ==> Test [41] Encode integer: PASS [Input] { 10 } [Received:success] { "10" } ==> Test [42] Encode string: PASS [Input] { "hello" } [Received:success] { "\"hello\"" } ==> Test [43] Encode Lua function [throw error]: PASS [Input] { "" } [Received:error] { "Cannot serialise function: type not supported" } ==> Test [44] Set decode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [45] Decode hexadecimal: PASS [Input] { "0x6.ffp1" } [Received:success] { 13.9921875 } ==> Test [46] Decode numbers with leading zero: PASS [Input] { "[ 0123, 00.33 ]" } [Received:success] { { 123, 0.33 } } ==> Test [47] Decode +-Inf: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:success] { { inf, inf, -inf } } ==> Test [48] Decode +-Infinity: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:success] { { inf, inf, -inf } } ==> Test [49] Decode +-NaN: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:success] { { nan, nan, -nan } } ==> Test [50] Decode Infrared (not infinity) [throw error]: PASS [Input] { "Infrared" } [Received:error] { "Expected the end but found invalid token at character 4" } ==> Test [51] Decode Noodle (not NaN) [throw error]: PASS [Input] { "Noodle" } [Received:error] { "Expected value but found invalid token at character 1" } ==> Test [52] Set decode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [53] Decode hexadecimal [throw error]: PASS [Input] { "0x6" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [54] Decode numbers with leading zero [throw error]: PASS [Input] { "[ 0123, 00.33 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [55] Decode +-Inf [throw error]: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [56] Decode +-Infinity [throw error]: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [57] Decode +-NaN [throw error]: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [58] Set decode_invalid_numbers("on"): PASS [Input] { "on" } [Received:success] { true } ==> Test [59] Set encode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [60] Encode NaN [throw error]: PASS [Input] { nan } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [61] Encode Infinity [throw error]: PASS [Input] { inf } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [62] Set encode_invalid_numbers("null"): PASS [Input] { "null" } [Received:success] { "null" } ==> Test [63] Encode NaN as null: PASS [Input] { nan } [Received:success] { "null" } ==> Test [64] Encode Infinity as null: PASS [Input] { inf } [Received:success] { "null" } ==> Test [65] Set encode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [66] Encode NaN: PASS [Input] { nan } [Received:success] { "nan" } ==> Test [67] Encode Infinity: PASS [Input] { inf } [Received:success] { "inf" } ==> Test [68] 
Set encode_invalid_numbers("off"): PASS [Input] { "off" } [Received:success] { false } ==> Test [69] Set encode_sparse_array(true, 2, 3): PASS [Input] { true, 2, 3 } [Received:success] { true, 2, 3 } ==> Test [70] Encode sparse table as array #1: PASS [Input] { { [3] = "sparse test" } } [Received:success] { "[null,null,\"sparse test\"]" } ==> Test [71] Encode sparse table as array #2: PASS [Input] { { "one", nil, nil, "sparse test" } } [Received:success] { "[\"one\",null,null,\"sparse test\"]" } ==> Test [72] Encode table with numeric string key as object: PASS [Input] { { ["2"] = "numeric string key test" } } [Received:success] { "{\"2\":\"numeric string key test\"}" } ==> Test [73] Set encode_sparse_array(false): PASS [Input] { false } [Received:success] { false, 2, 3 } ==> Test [74] Encode table with incompatible key [throw error]: PASS [Input] { { [false] = "wrong" } } [Received:error] { "Cannot serialise boolean: table key must be a number or string" } ==> Test [75] Encode all octets (8-bit clean): PASS [Input] { "\0\1\2\3\4\5\6\7\8\9\ \11\12\13\14\15\16\17\18\19\20\21\22\23\24\25\26\27\28\29\30\31 !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\127" } [Received:success] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } ==> Test [76] Decode all escaped octets: PASS [Input] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } [Received:success] { "\0\1\2\3\4\5\6\7\8\9\ \11\12\13\14\15\16\17\18\19\20\21\22\23\24\25\26\27\28\29\30\31 !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\127" } ==> Test [77] Decode single UTF-16 escape: PASS [Input] { "\"\\uF800\"" } [Received:success] { "" } ==> Test [78] Decode swapped surrogate pair [throw error]: PASS [Input] { "\"\\uDC00\\uD800\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [79] Decode duplicate high surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [80] Decode duplicate low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [81] Decode missing low surrogate [throw error]: PASS [Input] { "\"\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [82] Decode invalid low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uD\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Set locale to cs_CZ (comma separator) ==> Test [83] Encode number under comma locale: PASS [Input] { 1,5 } [Received:success] { "1.5" } ==> Test [84] Decode number in array under comma locale: PASS [Input] { "[ 10, \"test\" ]" } [Received:success] { { 10, "test" } } ==> Revert locale to POSIX ==> Test [85] Set encode_keep_buffer(false): PASS [Input] { 
false } [Received:success] { false } ==> Test [86] Set encode_number_precision(3): PASS [Input] { 3 } [Received:success] { 3 } ==> Test [87] Encode number with precision 3: PASS [Input] { 0.33333333333333 } [Received:success] { "0.333" } ==> Test [88] Set encode_number_precision(14): PASS [Input] { 14 } [Received:success] { 14 } ==> Test [89] Set encode_keep_buffer(true): PASS [Input] { true } [Received:success] { true } ==> Test [90] Set encode_number_precision(0) [throw error]: PASS [Input] { 0 } [Received:error] { "bad argument #1 to '?' (expected integer between 1 and 14)" } ==> Test [91] Set encode_number_precision("five") [throw error]: PASS [Input] { "five" } [Received:error] { "bad argument #1 to '?' (number expected, got string)" } ==> Test [92] Set encode_keep_buffer(nil, true) [throw error]: PASS [Input] { nil, true } [Received:error] { "bad argument #2 to '?' (found too many arguments)" } ==> Test [93] Set encode_max_depth("wrong") [throw error]: PASS [Input] { "wrong" } [Received:error] { "bad argument #1 to '?' (number expected, got string)" } ==> Test [94] Set decode_max_depth(0) [throw error]: PASS [Input] { "0" } [Received:error] { "bad argument #1 to '?' (expected integer between 1 and 2147483647)" } ==> Test [95] Set encode_invalid_numbers(-2) [throw error]: PASS [Input] { -2 } [Received:error] { "bad argument #1 to '?' (invalid option '-2')" } ==> Test [96] Set decode_invalid_numbers(true, false) [throw error]: PASS [Input] { true, false } [Received:error] { "bad argument #2 to '?' (found too many arguments)" } ==> Test [97] Set encode_sparse_array("not quite on") [throw error]: PASS [Input] { "not quite on" } [Received:error] { "bad argument #1 to '?' (invalid option 'not quite on')" } ==> Reset Lua CJSON configuration ==> Test [98] Check encode_sparse_array(): PASS [Input] { } [Received:success] { false, 2, 10 } ==> Test [99] Encode (safe) simple value: PASS [Input] { true } [Received:success] { "true" } ==> Test [100] Encode (safe) argument validation [throw error]: PASS [Input] { "arg1", "arg2" } [Received:error] { "bad argument #1 to '?' (expected 1 argument)" } ==> Test [101] Decode (safe) error generation: PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Test [102] Decode (safe) error generation after new(): PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Summary: all tests succeeded ************************************************** /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=link aarch64-linux-gnu-gcc -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.2 -Wall -Wextra -Wl,--no-add-needed \ -static -o /build/lua-cjson-2.1.0+dfsg/5.2-cjson/app-static -I . -I /build/lua-cjson-2.1.0+dfsg/5.2-cjson/ \ /usr/share/dh-lua/test/5.2/app.c /build/lua-cjson-2.1.0+dfsg/5.2-cjson/liblua5.2-cjson.la \ -Wl,-z,relro -Wl,-z,now -llua5.2 -lm -ldl libtool: link: aarch64-linux-gnu-gcc -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.2 -Wall -Wextra -Wl,--no-add-needed -o /build/lua-cjson-2.1.0+dfsg/5.2-cjson/app-static -I . 
-I /build/lua-cjson-2.1.0+dfsg/5.2-cjson/ /usr/share/dh-lua/test/5.2/app.c -Wl,-z -Wl,relro -Wl,-z -Wl,now /build/lua-cjson-2.1.0+dfsg/5.2-cjson/.libs/liblua5.2-cjson.a -llua5.2 -lm -ldl ldd /build/lua-cjson-2.1.0+dfsg/5.2-cjson/app-static linux-vdso.so.1 (0x0000ffff8de27000) liblua5.2.so.0 => /lib/aarch64-linux-gnu/liblua5.2.so.0 (0x0000ffff8dd60000) libc.so.6 => /lib/aarch64-linux-gnu/libc.so.6 (0x0000ffff8dbb0000) /lib/ld-linux-aarch64.so.1 (0x0000ffff8ddea000) libm.so.6 => /lib/aarch64-linux-gnu/libm.so.6 (0x0000ffff8db10000) *********************** app static (5.2) ********* Test: cd tests/ && @@LUA@@ test.lua ==> Testing Lua CJSON version 2.1.0 ==> Test [1] Check module name, version: PASS [Input] { } [Received:success] { "cjson", "2.1.0" } ==> Test [2] Decode string: PASS [Input] { "\"test string\"" } [Received:success] { "test string" } ==> Test [3] Decode numbers: PASS [Input] { "[ 0.0, -5e3, -1, 0.3e-3, 1023.2, 0e10 ]" } [Received:success] { { 0, -5000, -1, 0.0003, 1023.2, 0 } } ==> Test [4] Decode null: PASS [Input] { "null" } [Received:success] { json.null } ==> Test [5] Decode true: PASS [Input] { "true" } [Received:success] { true } ==> Test [6] Decode false: PASS [Input] { "false" } [Received:success] { false } ==> Test [7] Decode object with numeric keys: PASS [Input] { "{ \"1\": \"one\", \"3\": \"three\" }" } [Received:success] { { ["3"] = "three", ["1"] = "one" } } ==> Test [8] Decode object with string keys: PASS [Input] { "{ \"a\": \"a\", \"b\": \"b\" }" } [Received:success] { { ["b"] = "b", ["a"] = "a" } } ==> Test [9] Decode array: PASS [Input] { "[ \"one\", null, \"three\" ]" } [Received:success] { { "one", json.null, "three" } } ==> Test [10] Decode UTF-16BE [throw error]: PASS [Input] { "\0\"\0\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [11] Decode UTF-16LE [throw error]: PASS [Input] { "\"\0\"\0" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [12] Decode UTF-32BE [throw error]: PASS [Input] { "\0\0\0\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [13] Decode UTF-32LE [throw error]: PASS [Input] { "\"\0\0\0" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [14] Decode partial JSON [throw error]: PASS [Input] { "{ \"unexpected eof\": " } [Received:error] { "Expected value but found T_END at character 21" } ==> Test [15] Decode with extra comma [throw error]: PASS [Input] { "{ \"extra data\": true }, false" } [Received:error] { "Expected the end but found T_COMMA at character 23" } ==> Test [16] Decode invalid escape code [throw error]: PASS [Input] { " { \"bad escape \\q code\" } " } [Received:error] { "Expected object key string but found invalid escape code at character 16" } ==> Test [17] Decode invalid unicode escape [throw error]: PASS [Input] { " { \"bad unicode \\u0f6 escape\" } " } [Received:error] { "Expected object key string but found invalid unicode escape code at character 17" } ==> Test [18] Decode invalid keyword [throw error]: PASS [Input] { " [ \"bad barewood\", test ] " } [Received:error] { "Expected value but found invalid token at character 20" } ==> Test [19] Decode invalid number #1 [throw error]: PASS [Input] { "[ -+12 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [20] Decode invalid number #2 [throw error]: PASS [Input] { "-v" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [21] Decode invalid number exponent [throw 
error]: PASS [Input] { "[ 0.4eg10 ]" } [Received:error] { "Expected comma or array end but found invalid token at character 6" } ==> Test [22] Set decode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [23] Decode array at nested limit: PASS [Input] { "[[[[[ \"nested\" ]]]]]" } [Received:success] { { { { { { "nested" } } } } } } ==> Test [24] Decode array over nested limit [throw error]: PASS [Input] { "[[[[[[ \"nested\" ]]]]]]" } [Received:error] { "Found too many nested data structures (6) at character 6" } ==> Test [25] Decode object at nested limit: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } [Received:success] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } ==> Test [26] Decode object over nested limit [throw error]: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":{\"f\":\"nested\"}}}}}}" } [Received:error] { "Found too many nested data structures (6) at character 26" } ==> Test [27] Set decode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [28] Decode deeply nested array [throw error]: PASS [Input] { "[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[1100]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]" } [Received:error] { "Found too many nested data structures (1001) at character 1001" } ==> Test [29] Set encode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [30] Encode nested table as array at nested 
limit: PASS [Input] { { { { { { "nested" } } } } } } [Received:success] { "[[[[[\"nested\"]]]]]" } ==> Test [31] Encode nested table as array after nested limit [throw error]: PASS [Input] { { { { { { { "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [32] Encode nested table as object at nested limit: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } [Received:success] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } ==> Test [33] Encode nested table as object over nested limit [throw error]: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = { ["f"] = "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [34] Encode table with cycle [throw error]: PASS [Input] { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [35] Set encode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [36] Encode deeply nested data [throw error]: PASS [Input] { { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, 
true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = Cannot serialise any further: too many nested tables, [2] = "string", ["a"] = Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (1001)" } ==> Test [37] Encode null: PASS [Input] { json.null } [Received:success] { "null" } ==> Test [38] Encode true: PASS [Input] { true } [Received:success] { "true" } ==> Test [39] Encode false: PASS [Input] { false } [Received:success] { "false" } ==> Test [40] Encode empty object: PASS [Input] { { } } [Received:success] { "{}" } ==> Test [41] Encode integer: PASS [Input] { 10 } [Received:success] { "10" } ==> Test [42] Encode string: PASS [Input] { "hello" } [Received:success] { "\"hello\"" } ==> Test [43] Encode Lua function [throw error]: PASS [Input] { "" } [Received:error] { "Cannot serialise function: type not supported" } ==> Test [44] Set decode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [45] Decode hexadecimal: PASS [Input] { "0x6.ffp1" } [Received:success] { 13.9921875 } ==> Test [46] Decode numbers with leading zero: PASS [Input] { "[ 0123, 00.33 ]" } [Received:success] { { 123, 0.33 } } ==> Test [47] Decode +-Inf: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:success] { { inf, inf, -inf } } ==> Test [48] Decode +-Infinity: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:success] { { inf, inf, -inf } } ==> Test [49] Decode +-NaN: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:success] { { nan, nan, -nan } } ==> Test [50] Decode Infrared (not infinity) [throw error]: PASS [Input] { "Infrared" } [Received:error] { "Expected the end but found invalid token at character 4" } ==> Test [51] Decode Noodle (not NaN) [throw error]: PASS [Input] { "Noodle" } [Received:error] { "Expected value but found invalid token at character 1" } ==> Test [52] Set decode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [53] Decode hexadecimal [throw error]: PASS [Input] { "0x6" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [54] Decode numbers with leading zero [throw error]: PASS [Input] { "[ 0123, 00.33 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [55] Decode +-Inf [throw error]: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [56] Decode +-Infinity [throw error]: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [57] Decode +-NaN [throw error]: PASS [Input] { "[ 
+NaN, NaN, -NaN ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [58] Set decode_invalid_numbers("on"): PASS [Input] { "on" } [Received:success] { true } ==> Test [59] Set encode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [60] Encode NaN [throw error]: PASS [Input] { nan } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [61] Encode Infinity [throw error]: PASS [Input] { inf } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [62] Set encode_invalid_numbers("null"): PASS [Input] { "null" } [Received:success] { "null" } ==> Test [63] Encode NaN as null: PASS [Input] { nan } [Received:success] { "null" } ==> Test [64] Encode Infinity as null: PASS [Input] { inf } [Received:success] { "null" } ==> Test [65] Set encode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [66] Encode NaN: PASS [Input] { nan } [Received:success] { "nan" } ==> Test [67] Encode Infinity: PASS [Input] { inf } [Received:success] { "inf" } ==> Test [68] Set encode_invalid_numbers("off"): PASS [Input] { "off" } [Received:success] { false } ==> Test [69] Set encode_sparse_array(true, 2, 3): PASS [Input] { true, 2, 3 } [Received:success] { true, 2, 3 } ==> Test [70] Encode sparse table as array #1: PASS [Input] { { [3] = "sparse test" } } [Received:success] { "[null,null,\"sparse test\"]" } ==> Test [71] Encode sparse table as array #2: PASS [Input] { { "one", nil, nil, "sparse test" } } [Received:success] { "[\"one\",null,null,\"sparse test\"]" } ==> Test [72] Encode table with numeric string key as object: PASS [Input] { { ["2"] = "numeric string key test" } } [Received:success] { "{\"2\":\"numeric string key test\"}" } ==> Test [73] Set encode_sparse_array(false): PASS [Input] { false } [Received:success] { false, 2, 3 } ==> Test [74] Encode table with incompatible key [throw error]: PASS [Input] { { [false] = "wrong" } } [Received:error] { "Cannot serialise boolean: table key must be a number or string" } ==> Test [75] Encode all octets (8-bit clean): PASS [Input] { "\0\1\2\3\4\5\6\7\8\9\ \11\12\13\14\15\16\17\18\19\20\21\22\23\24\25\26\27\28\29\30\31 !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\127" } [Received:success] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } ==> Test [76] Decode all escaped octets: PASS [Input] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } [Received:success] { "\0\1\2\3\4\5\6\7\8\9\ \11\12\13\14\15\16\17\18\19\20\21\22\23\24\25\26\27\28\29\30\31 !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\127" } ==> Test [77] Decode single UTF-16 escape: PASS [Input] { "\"\\uF800\"" } [Received:success] { "" } ==> Test [78] Decode swapped surrogate pair [throw error]: PASS [Input] { "\"\\uDC00\\uD800\"" } [Received:error] { "Expected value but found invalid unicode escape code at 
character 2" } ==> Test [79] Decode duplicate high surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [80] Decode duplicate low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [81] Decode missing low surrogate [throw error]: PASS [Input] { "\"\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [82] Decode invalid low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uD\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Set locale to cs_CZ (comma separator) ==> Test [83] Encode number under comma locale: PASS [Input] { 1,5 } [Received:success] { "1.5" } ==> Test [84] Decode number in array under comma locale: PASS [Input] { "[ 10, \"test\" ]" } [Received:success] { { 10, "test" } } ==> Revert locale to POSIX ==> Test [85] Set encode_keep_buffer(false): PASS [Input] { false } [Received:success] { false } ==> Test [86] Set encode_number_precision(3): PASS [Input] { 3 } [Received:success] { 3 } ==> Test [87] Encode number with precision 3: PASS [Input] { 0.33333333333333 } [Received:success] { "0.333" } ==> Test [88] Set encode_number_precision(14): PASS [Input] { 14 } [Received:success] { 14 } ==> Test [89] Set encode_keep_buffer(true): PASS [Input] { true } [Received:success] { true } ==> Test [90] Set encode_number_precision(0) [throw error]: PASS [Input] { 0 } [Received:error] { "bad argument #1 to '?' (expected integer between 1 and 14)" } ==> Test [91] Set encode_number_precision("five") [throw error]: PASS [Input] { "five" } [Received:error] { "bad argument #1 to '?' (number expected, got string)" } ==> Test [92] Set encode_keep_buffer(nil, true) [throw error]: PASS [Input] { nil, true } [Received:error] { "bad argument #2 to '?' (found too many arguments)" } ==> Test [93] Set encode_max_depth("wrong") [throw error]: PASS [Input] { "wrong" } [Received:error] { "bad argument #1 to '?' (number expected, got string)" } ==> Test [94] Set decode_max_depth(0) [throw error]: PASS [Input] { "0" } [Received:error] { "bad argument #1 to '?' (expected integer between 1 and 2147483647)" } ==> Test [95] Set encode_invalid_numbers(-2) [throw error]: PASS [Input] { -2 } [Received:error] { "bad argument #1 to '?' (invalid option '-2')" } ==> Test [96] Set decode_invalid_numbers(true, false) [throw error]: PASS [Input] { true, false } [Received:error] { "bad argument #2 to '?' (found too many arguments)" } ==> Test [97] Set encode_sparse_array("not quite on") [throw error]: PASS [Input] { "not quite on" } [Received:error] { "bad argument #1 to '?' (invalid option 'not quite on')" } ==> Reset Lua CJSON configuration ==> Test [98] Check encode_sparse_array(): PASS [Input] { } [Received:success] { false, 2, 10 } ==> Test [99] Encode (safe) simple value: PASS [Input] { true } [Received:success] { "true" } ==> Test [100] Encode (safe) argument validation [throw error]: PASS [Input] { "arg1", "arg2" } [Received:error] { "bad argument #1 to '?' 
(expected 1 argument)" } ==> Test [101] Decode (safe) error generation: PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Test [102] Decode (safe) error generation after new(): PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Summary: all tests succeeded ************************************************** Target test made Making target test for debian/lua5.3.dh-lua.conf # tests Copying lua/cjson/util.lua in /build/lua-cjson-2.1.0+dfsg/5.3-cjson for test ********************** lua dynamic (5.3) ********* Test: cd tests/ && @@LUA@@ test.lua ==> Testing Lua CJSON version 2.1.0 ==> Test [1] Check module name, version: PASS [Input] { } [Received:success] { "cjson", "2.1.0" } ==> Test [2] Decode string: PASS [Input] { "\"test string\"" } [Received:success] { "test string" } ==> Test [3] Decode numbers: PASS [Input] { "[ 0.0, -5e3, -1, 0.3e-3, 1023.2, 0e10 ]" } [Received:success] { { 0.0, -5000.0, -1.0, 0.0003, 1023.2, 0.0 } } ==> Test [4] Decode null: PASS [Input] { "null" } [Received:success] { json.null } ==> Test [5] Decode true: PASS [Input] { "true" } [Received:success] { true } ==> Test [6] Decode false: PASS [Input] { "false" } [Received:success] { false } ==> Test [7] Decode object with numeric keys: PASS [Input] { "{ \"1\": \"one\", \"3\": \"three\" }" } [Received:success] { { ["3"] = "three", ["1"] = "one" } } ==> Test [8] Decode object with string keys: PASS [Input] { "{ \"a\": \"a\", \"b\": \"b\" }" } [Received:success] { { ["b"] = "b", ["a"] = "a" } } ==> Test [9] Decode array: PASS [Input] { "[ \"one\", null, \"three\" ]" } [Received:success] { { "one", json.null, "three" } } ==> Test [10] Decode UTF-16BE [throw error]: PASS [Input] { "\0\"\0\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [11] Decode UTF-16LE [throw error]: PASS [Input] { "\"\0\"\0" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [12] Decode UTF-32BE [throw error]: PASS [Input] { "\0\0\0\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [13] Decode UTF-32LE [throw error]: PASS [Input] { "\"\0\0\0" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [14] Decode partial JSON [throw error]: PASS [Input] { "{ \"unexpected eof\": " } [Received:error] { "Expected value but found T_END at character 21" } ==> Test [15] Decode with extra comma [throw error]: PASS [Input] { "{ \"extra data\": true }, false" } [Received:error] { "Expected the end but found T_COMMA at character 23" } ==> Test [16] Decode invalid escape code [throw error]: PASS [Input] { " { \"bad escape \\q code\" } " } [Received:error] { "Expected object key string but found invalid escape code at character 16" } ==> Test [17] Decode invalid unicode escape [throw error]: PASS [Input] { " { \"bad unicode \\u0f6 escape\" } " } [Received:error] { "Expected object key string but found invalid unicode escape code at character 17" } ==> Test [18] Decode invalid keyword [throw error]: PASS [Input] { " [ \"bad barewood\", test ] " } [Received:error] { "Expected value but found invalid token at character 20" } ==> Test [19] Decode invalid number #1 [throw error]: PASS [Input] { "[ -+12 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [20] Decode invalid number #2 [throw error]: PASS [Input] { "-v" } [Received:error] { "Expected value but found invalid number at character 1" } 
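Tests [14]-[21] above (and [99]-[102] at the end of each run) show the two error styles the module exposes: the plain cjson module raises a Lua error, while the cjson.safe wrapper returns nil plus the message. A minimal sketch of how those paths look from calling code, assuming only the cjson / cjson.safe entry points and the error strings already printed in this log:

    local cjson = require "cjson"
    -- plain cjson raises a Lua error on malformed input; pcall captures it
    local ok, err = pcall(cjson.decode, "[ -+12 ]")
    print(ok, err)    -- false   Expected value but found invalid number at character 3

    -- the "safe" wrapper returns nil plus the same message instead of raising
    local safe = require "cjson.safe"
    local value, msg = safe.decode("Oops")
    print(value, msg) -- nil     Expected value but found invalid token at character 1
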
==> Test [21] Decode invalid number exponent [throw error]: PASS [Input] { "[ 0.4eg10 ]" } [Received:error] { "Expected comma or array end but found invalid token at character 6" } ==> Test [22] Set decode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [23] Decode array at nested limit: PASS [Input] { "[[[[[ \"nested\" ]]]]]" } [Received:success] { { { { { { "nested" } } } } } } ==> Test [24] Decode array over nested limit [throw error]: PASS [Input] { "[[[[[[ \"nested\" ]]]]]]" } [Received:error] { "Found too many nested data structures (6) at character 6" } ==> Test [25] Decode object at nested limit: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } [Received:success] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } ==> Test [26] Decode object over nested limit [throw error]: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":{\"f\":\"nested\"}}}}}}" } [Received:error] { "Found too many nested data structures (6) at character 26" } ==> Test [27] Set decode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [28] Decode deeply nested array [throw error]: PASS [Input] { "[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[1100]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]" } [Received:error] { "Found too many nested data structures (1001) at character 1001" } ==> Test [29] Set encode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } 
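Tests [22]-[28] set, hit, and then exceed the decoder's nesting limit. Sketched as direct calls, again assuming only the functions and messages already shown in the results above:

    local cjson = require "cjson"
    cjson.decode_max_depth(5)
    local t = cjson.decode('[[[[[ "nested" ]]]]]')   -- exactly at the limit
    print(t[1][1][1][1][1])                          -- nested
    local ok, err = pcall(cjson.decode, '[[[[[[ "nested" ]]]]]]')
    print(err)    -- Found too many nested data structures (6) at character 6
    cjson.decode_max_depth(1000)                     -- test [27] raises it back to 1000
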
==> Test [30] Encode nested table as array at nested limit: PASS [Input] { { { { { { "nested" } } } } } } [Received:success] { "[[[[[\"nested\"]]]]]" } ==> Test [31] Encode nested table as array after nested limit [throw error]: PASS [Input] { { { { { { { "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [32] Encode nested table as object at nested limit: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } [Received:success] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } ==> Test [33] Encode nested table as object over nested limit [throw error]: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = { ["f"] = "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [34] Encode table with cycle [throw error]: PASS [Input] { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [35] Set encode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [36] Encode deeply nested data [throw error]: PASS [Input] { { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, 
json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = Cannot serialise any further: too many nested tables, [2] = "string", ["a"] = Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (1001)" } ==> Test [37] Encode null: PASS [Input] { json.null } [Received:success] { "null" } ==> Test [38] Encode true: PASS [Input] { true } [Received:success] { "true" } ==> Test [39] Encode false: PASS [Input] { false } [Received:success] { "false" } ==> Test [40] Encode empty object: PASS [Input] { { } } [Received:success] { "{}" } ==> Test [41] Encode integer: PASS [Input] { 10 } [Received:success] { "10" } ==> Test [42] Encode string: PASS [Input] { "hello" } [Received:success] { "\"hello\"" } ==> Test [43] Encode Lua function [throw error]: PASS [Input] { "" } [Received:error] { "Cannot serialise function: type not supported" } ==> Test [44] Set decode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [45] Decode hexadecimal: PASS [Input] { "0x6.ffp1" } [Received:success] { 13.9921875 } ==> Test [46] Decode numbers with leading zero: PASS [Input] { "[ 0123, 00.33 ]" } [Received:success] { { 123.0, 0.33 } } ==> Test [47] Decode +-Inf: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:success] { { inf, inf, -inf } } ==> Test [48] Decode +-Infinity: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:success] { { inf, inf, -inf } } ==> Test [49] Decode +-NaN: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:success] { { nan, nan, -nan } } ==> Test [50] Decode Infrared (not infinity) [throw error]: PASS [Input] { "Infrared" } [Received:error] { "Expected the end but found invalid token at character 4" } ==> Test [51] Decode Noodle (not NaN) [throw error]: PASS [Input] { "Noodle" } [Received:error] { "Expected value but found invalid token at character 1" } ==> Test [52] Set decode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [53] Decode hexadecimal [throw error]: PASS [Input] { "0x6" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [54] Decode numbers with leading zero [throw error]: PASS [Input] { "[ 0123, 00.33 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [55] Decode +-Inf [throw error]: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [56] Decode +-Infinity [throw error]: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:error] { "Expected value but found invalid token at character 3" } 
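Tests [44]-[56] toggle decode_invalid_numbers and show which non-standard number literals are accepted or rejected. A short sketch of the same toggle, using only inputs and results taken from the log:

    local cjson = require "cjson"
    cjson.decode_invalid_numbers(true)
    print(cjson.decode("0x6.ffp1"))           -- 13.9921875 (hex float accepted)
    print(cjson.decode("[ +Inf, -Inf ]")[1])  -- inf
    cjson.decode_invalid_numbers(false)
    local ok, err = pcall(cjson.decode, "0x6")
    print(err)    -- Expected value but found invalid number at character 1
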
==> Test [57] Decode +-NaN [throw error]: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [58] Set decode_invalid_numbers("on"): PASS [Input] { "on" } [Received:success] { true } ==> Test [59] Set encode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [60] Encode NaN [throw error]: PASS [Input] { nan } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [61] Encode Infinity [throw error]: PASS [Input] { inf } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [62] Set encode_invalid_numbers("null"): PASS [Input] { "null" } [Received:success] { "null" } ==> Test [63] Encode NaN as null: PASS [Input] { nan } [Received:success] { "null" } ==> Test [64] Encode Infinity as null: PASS [Input] { inf } [Received:success] { "null" } ==> Test [65] Set encode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [66] Encode NaN: PASS [Input] { nan } [Received:success] { "nan" } ==> Test [67] Encode Infinity: PASS [Input] { inf } [Received:success] { "inf" } ==> Test [68] Set encode_invalid_numbers("off"): PASS [Input] { "off" } [Received:success] { false } ==> Test [69] Set encode_sparse_array(true, 2, 3): PASS [Input] { true, 2, 3 } [Received:success] { true, 2, 3 } ==> Test [70] Encode sparse table as array #1: PASS [Input] { { [3] = "sparse test" } } [Received:success] { "[null,null,\"sparse test\"]" } ==> Test [71] Encode sparse table as array #2: PASS [Input] { { "one", nil, nil, "sparse test" } } [Received:success] { "[\"one\",null,null,\"sparse test\"]" } ==> Test [72] Encode table with numeric string key as object: PASS [Input] { { ["2"] = "numeric string key test" } } [Received:success] { "{\"2\":\"numeric string key test\"}" } ==> Test [73] Set encode_sparse_array(false): PASS [Input] { false } [Received:success] { false, 2, 3 } ==> Test [74] Encode table with incompatible key [throw error]: PASS [Input] { { [false] = "wrong" } } [Received:error] { "Cannot serialise boolean: table key must be a number or string" } ==> Test [75] Encode all octets (8-bit clean): PASS [Input] { "\0\1\2\3\4\5\6\7\8\9\ \11\12\13\14\15\16\17\18\19\20\21\22\23\24\25\26\27\28\29\30\31 !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\127" } [Received:success] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } ==> Test [76] Decode all escaped octets: PASS [Input] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } [Received:success] { "\0\1\2\3\4\5\6\7\8\9\ \11\12\13\14\15\16\17\18\19\20\21\22\23\24\25\26\27\28\29\30\31 !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\127" } ==> Test [77] Decode single UTF-16 escape: PASS [Input] { "\"\\uF800\"" } [Received:success] { "" } ==> Test [78] Decode swapped surrogate pair [throw error]: PASS [Input] { "\"\\uDC00\\uD800\"" } [Received:error] { 
"Expected value but found invalid unicode escape code at character 2" } ==> Test [79] Decode duplicate high surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [80] Decode duplicate low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [81] Decode missing low surrogate [throw error]: PASS [Input] { "\"\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [82] Decode invalid low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uD\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Set locale to cs_CZ (comma separator) ==> Test [83] Encode number under comma locale: PASS [Input] { 1,5 } [Received:success] { "1.5" } ==> Test [84] Decode number in array under comma locale: PASS [Input] { "[ 10, \"test\" ]" } [Received:success] { { 10,0, "test" } } ==> Revert locale to POSIX ==> Test [85] Set encode_keep_buffer(false): PASS [Input] { false } [Received:success] { false } ==> Test [86] Set encode_number_precision(3): PASS [Input] { 3 } [Received:success] { 3 } ==> Test [87] Encode number with precision 3: PASS [Input] { 0.33333333333333 } [Received:success] { "0.333" } ==> Test [88] Set encode_number_precision(14): PASS [Input] { 14 } [Received:success] { 14 } ==> Test [89] Set encode_keep_buffer(true): PASS [Input] { true } [Received:success] { true } ==> Test [90] Set encode_number_precision(0) [throw error]: PASS [Input] { 0 } [Received:error] { "bad argument #1 to 'cjson.encode_number_precision' (expected integer between 1 and 14)" } ==> Test [91] Set encode_number_precision("five") [throw error]: PASS [Input] { "five" } [Received:error] { "bad argument #1 to 'cjson.encode_number_precision' (number expected, got string)" } ==> Test [92] Set encode_keep_buffer(nil, true) [throw error]: PASS [Input] { nil, true } [Received:error] { "bad argument #2 to 'cjson.encode_keep_buffer' (found too many arguments)" } ==> Test [93] Set encode_max_depth("wrong") [throw error]: PASS [Input] { "wrong" } [Received:error] { "bad argument #1 to 'cjson.encode_max_depth' (number expected, got string)" } ==> Test [94] Set decode_max_depth(0) [throw error]: PASS [Input] { "0" } [Received:error] { "bad argument #1 to 'cjson.decode_max_depth' (expected integer between 1 and 2147483647)" } ==> Test [95] Set encode_invalid_numbers(-2) [throw error]: PASS [Input] { -2 } [Received:error] { "bad argument #1 to 'cjson.encode_invalid_numbers' (invalid option '-2')" } ==> Test [96] Set decode_invalid_numbers(true, false) [throw error]: PASS [Input] { true, false } [Received:error] { "bad argument #2 to 'cjson.decode_invalid_numbers' (found too many arguments)" } ==> Test [97] Set encode_sparse_array("not quite on") [throw error]: PASS [Input] { "not quite on" } [Received:error] { "bad argument #1 to 'cjson.encode_sparse_array' (invalid option 'not quite on')" } ==> Reset Lua CJSON configuration ==> Test [98] Check encode_sparse_array(): PASS [Input] { } [Received:success] { false, 2, 10 } ==> Test [99] Encode (safe) simple value: PASS [Input] { true } [Received:success] { "true" } ==> Test [100] Encode (safe) argument validation [throw error]: PASS [Input] { "arg1", "arg2" } [Received:error] { "bad argument #1 to 'cjson.safe.encode' (expected 1 argument)" } ==> Test [101] Decode (safe) error 
generation: PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Test [102] Decode (safe) error generation after new(): PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Summary: all tests succeeded ************************************************** /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=link aarch64-linux-gnu-gcc -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.3 -Wall -Wextra -Wl,--no-add-needed \ -o /build/lua-cjson-2.1.0+dfsg/5.3-cjson/app-dynamic -I . -I /build/lua-cjson-2.1.0+dfsg/5.3-cjson/ \ /usr/share/dh-lua/test/5.3/app.c /build/lua-cjson-2.1.0+dfsg/5.3-cjson/liblua5.3-cjson.la \ -Wl,-z,relro -Wl,-z,now -llua5.3 libtool: link: aarch64-linux-gnu-gcc -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.3 -Wall -Wextra -Wl,--no-add-needed -o /build/lua-cjson-2.1.0+dfsg/5.3-cjson/.libs/app-dynamic -I . -I /build/lua-cjson-2.1.0+dfsg/5.3-cjson/ /usr/share/dh-lua/test/5.3/app.c -Wl,-z -Wl,relro -Wl,-z -Wl,now /build/lua-cjson-2.1.0+dfsg/5.3-cjson/.libs/liblua5.3-cjson.so -llua5.3 -Wl,-rpath -Wl,/usr//lib/aarch64-linux-gnu /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=execute -dlopen /build/lua-cjson-2.1.0+dfsg/5.3-cjson/liblua5.3-cjson.la \ ldd /build/lua-cjson-2.1.0+dfsg/5.3-cjson/app-dynamic linux-vdso.so.1 (0x0000ffff81986000) liblua5.3-cjson.so.0 => /build/lua-cjson-2.1.0+dfsg/5.3-cjson/.libs/liblua5.3-cjson.so.0 (0x0000ffff818f0000) liblua5.3.so.0 => /usr//lib/aarch64-linux-gnu/liblua5.3.so.0 (0x0000ffff81890000) libc.so.6 => /usr//lib/aarch64-linux-gnu/libc.so.6 (0x0000ffff816e0000) /lib/ld-linux-aarch64.so.1 (0x0000ffff81949000) libm.so.6 => /lib/aarch64-linux-gnu/libm.so.6 (0x0000ffff81640000) ********************** app dynamic (5.3) ********* Test: cd tests/ && @@LUA@@ test.lua ==> Testing Lua CJSON version 2.1.0 ==> Test [1] Check module name, version: PASS [Input] { } [Received:success] { "cjson", "2.1.0" } ==> Test [2] Decode string: PASS [Input] { "\"test string\"" } [Received:success] { "test string" } ==> Test [3] Decode numbers: PASS [Input] { "[ 0.0, -5e3, -1, 0.3e-3, 1023.2, 0e10 ]" } [Received:success] { { 0.0, -5000.0, -1.0, 0.0003, 1023.2, 0.0 } } ==> Test [4] Decode null: PASS [Input] { "null" } [Received:success] { json.null } ==> Test [5] Decode true: PASS [Input] { "true" } [Received:success] { true } ==> Test [6] Decode false: PASS [Input] { "false" } [Received:success] { false } ==> Test [7] Decode object with numeric keys: PASS [Input] { "{ \"1\": \"one\", \"3\": \"three\" }" } [Received:success] { { ["3"] = "three", ["1"] = "one" } } ==> Test [8] Decode object with string keys: PASS [Input] { "{ \"a\": \"a\", \"b\": \"b\" }" } [Received:success] { { ["b"] = "b", ["a"] = "a" } } ==> Test [9] Decode array: PASS [Input] { "[ \"one\", null, \"three\" ]" } [Received:success] { { "one", json.null, "three" } } ==> Test [10] Decode UTF-16BE [throw error]: PASS [Input] { "\0\"\0\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [11] Decode UTF-16LE [throw error]: PASS [Input] { "\"\0\"\0" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [12] Decode UTF-32BE [throw error]: PASS [Input] { "\0\0\0\"" 
} [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [13] Decode UTF-32LE [throw error]: PASS [Input] { "\"\0\0\0" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [14] Decode partial JSON [throw error]: PASS [Input] { "{ \"unexpected eof\": " } [Received:error] { "Expected value but found T_END at character 21" } ==> Test [15] Decode with extra comma [throw error]: PASS [Input] { "{ \"extra data\": true }, false" } [Received:error] { "Expected the end but found T_COMMA at character 23" } ==> Test [16] Decode invalid escape code [throw error]: PASS [Input] { " { \"bad escape \\q code\" } " } [Received:error] { "Expected object key string but found invalid escape code at character 16" } ==> Test [17] Decode invalid unicode escape [throw error]: PASS [Input] { " { \"bad unicode \\u0f6 escape\" } " } [Received:error] { "Expected object key string but found invalid unicode escape code at character 17" } ==> Test [18] Decode invalid keyword [throw error]: PASS [Input] { " [ \"bad barewood\", test ] " } [Received:error] { "Expected value but found invalid token at character 20" } ==> Test [19] Decode invalid number #1 [throw error]: PASS [Input] { "[ -+12 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [20] Decode invalid number #2 [throw error]: PASS [Input] { "-v" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [21] Decode invalid number exponent [throw error]: PASS [Input] { "[ 0.4eg10 ]" } [Received:error] { "Expected comma or array end but found invalid token at character 6" } ==> Test [22] Set decode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [23] Decode array at nested limit: PASS [Input] { "[[[[[ \"nested\" ]]]]]" } [Received:success] { { { { { { "nested" } } } } } } ==> Test [24] Decode array over nested limit [throw error]: PASS [Input] { "[[[[[[ \"nested\" ]]]]]]" } [Received:error] { "Found too many nested data structures (6) at character 6" } ==> Test [25] Decode object at nested limit: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } [Received:success] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } ==> Test [26] Decode object over nested limit [throw error]: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":{\"f\":\"nested\"}}}}}}" } [Received:error] { "Found too many nested data structures (6) at character 26" } ==> Test [27] Set decode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [28] Decode deeply nested array [throw error]: PASS [Input] { 
"[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[1100]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]" } [Received:error] { "Found too many nested data structures (1001) at character 1001" } ==> Test [29] Set encode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [30] Encode nested table as array at nested limit: PASS [Input] { { { { { { "nested" } } } } } } [Received:success] { "[[[[[\"nested\"]]]]]" } ==> Test [31] Encode nested table as array after nested limit [throw error]: PASS [Input] { { { { { { { "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [32] Encode nested table as object at nested limit: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } [Received:success] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } ==> Test [33] Encode nested table as object over nested limit [throw error]: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = { ["f"] = "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [34] Encode table with cycle [throw error]: PASS [Input] { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive 
nesting (6)" } ==> Test [35] Set encode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [36] Encode deeply nested data [throw error]: PASS [Input] { { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = Cannot serialise any further: too many nested tables, [2] = "string", ["a"] = Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (1001)" } 
==> Test [37] Encode null: PASS [Input] { json.null } [Received:success] { "null" } ==> Test [38] Encode true: PASS [Input] { true } [Received:success] { "true" } ==> Test [39] Encode false: PASS [Input] { false } [Received:success] { "false" } ==> Test [40] Encode empty object: PASS [Input] { { } } [Received:success] { "{}" } ==> Test [41] Encode integer: PASS [Input] { 10 } [Received:success] { "10" } ==> Test [42] Encode string: PASS [Input] { "hello" } [Received:success] { "\"hello\"" } ==> Test [43] Encode Lua function [throw error]: PASS [Input] { "" } [Received:error] { "Cannot serialise function: type not supported" } ==> Test [44] Set decode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [45] Decode hexadecimal: PASS [Input] { "0x6.ffp1" } [Received:success] { 13.9921875 } ==> Test [46] Decode numbers with leading zero: PASS [Input] { "[ 0123, 00.33 ]" } [Received:success] { { 123.0, 0.33 } } ==> Test [47] Decode +-Inf: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:success] { { inf, inf, -inf } } ==> Test [48] Decode +-Infinity: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:success] { { inf, inf, -inf } } ==> Test [49] Decode +-NaN: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:success] { { nan, nan, -nan } } ==> Test [50] Decode Infrared (not infinity) [throw error]: PASS [Input] { "Infrared" } [Received:error] { "Expected the end but found invalid token at character 4" } ==> Test [51] Decode Noodle (not NaN) [throw error]: PASS [Input] { "Noodle" } [Received:error] { "Expected value but found invalid token at character 1" } ==> Test [52] Set decode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [53] Decode hexadecimal [throw error]: PASS [Input] { "0x6" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [54] Decode numbers with leading zero [throw error]: PASS [Input] { "[ 0123, 00.33 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [55] Decode +-Inf [throw error]: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [56] Decode +-Infinity [throw error]: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [57] Decode +-NaN [throw error]: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [58] Set decode_invalid_numbers("on"): PASS [Input] { "on" } [Received:success] { true } ==> Test [59] Set encode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [60] Encode NaN [throw error]: PASS [Input] { nan } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [61] Encode Infinity [throw error]: PASS [Input] { inf } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [62] Set encode_invalid_numbers("null"): PASS [Input] { "null" } [Received:success] { "null" } ==> Test [63] Encode NaN as null: PASS [Input] { nan } [Received:success] { "null" } ==> Test [64] Encode Infinity as null: PASS [Input] { inf } [Received:success] { "null" } ==> Test [65] Set encode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [66] Encode NaN: PASS [Input] { nan } [Received:success] { "nan" } ==> Test [67] Encode Infinity: PASS [Input] { inf } [Received:success] { "inf" } ==> Test [68] 
Set encode_invalid_numbers("off"): PASS [Input] { "off" } [Received:success] { false } ==> Test [69] Set encode_sparse_array(true, 2, 3): PASS [Input] { true, 2, 3 } [Received:success] { true, 2, 3 } ==> Test [70] Encode sparse table as array #1: PASS [Input] { { [3] = "sparse test" } } [Received:success] { "[null,null,\"sparse test\"]" } ==> Test [71] Encode sparse table as array #2: PASS [Input] { { "one", nil, nil, "sparse test" } } [Received:success] { "[\"one\",null,null,\"sparse test\"]" } ==> Test [72] Encode table with numeric string key as object: PASS [Input] { { ["2"] = "numeric string key test" } } [Received:success] { "{\"2\":\"numeric string key test\"}" } ==> Test [73] Set encode_sparse_array(false): PASS [Input] { false } [Received:success] { false, 2, 3 } ==> Test [74] Encode table with incompatible key [throw error]: PASS [Input] { { [false] = "wrong" } } [Received:error] { "Cannot serialise boolean: table key must be a number or string" } ==> Test [75] Encode all octets (8-bit clean): PASS [Input] { "\0\1\2\3\4\5\6\7\8\9\ \11\12\13\14\15\16\17\18\19\20\21\22\23\24\25\26\27\28\29\30\31 !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\127" } [Received:success] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } ==> Test [76] Decode all escaped octets: PASS [Input] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } [Received:success] { "\0\1\2\3\4\5\6\7\8\9\ \11\12\13\14\15\16\17\18\19\20\21\22\23\24\25\26\27\28\29\30\31 !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\127" } ==> Test [77] Decode single UTF-16 escape: PASS [Input] { "\"\\uF800\"" } [Received:success] { "" } ==> Test [78] Decode swapped surrogate pair [throw error]: PASS [Input] { "\"\\uDC00\\uD800\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [79] Decode duplicate high surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [80] Decode duplicate low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [81] Decode missing low surrogate [throw error]: PASS [Input] { "\"\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [82] Decode invalid low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uD\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Set locale to cs_CZ (comma separator) ==> Test [83] Encode number under comma locale: PASS [Input] { 1,5 } [Received:success] { "1.5" } ==> Test [84] Decode number in array under comma locale: PASS [Input] { "[ 10, \"test\" ]" } [Received:success] { { 10,0, "test" } } ==> Revert locale to POSIX ==> Test [85] Set encode_keep_buffer(false): PASS [Input] { 
false } [Received:success] { false } ==> Test [86] Set encode_number_precision(3): PASS [Input] { 3 } [Received:success] { 3 } ==> Test [87] Encode number with precision 3: PASS [Input] { 0.33333333333333 } [Received:success] { "0.333" } ==> Test [88] Set encode_number_precision(14): PASS [Input] { 14 } [Received:success] { 14 } ==> Test [89] Set encode_keep_buffer(true): PASS [Input] { true } [Received:success] { true } ==> Test [90] Set encode_number_precision(0) [throw error]: PASS [Input] { 0 } [Received:error] { "bad argument #1 to 'cjson.encode_number_precision' (expected integer between 1 and 14)" } ==> Test [91] Set encode_number_precision("five") [throw error]: PASS [Input] { "five" } [Received:error] { "bad argument #1 to 'cjson.encode_number_precision' (number expected, got string)" } ==> Test [92] Set encode_keep_buffer(nil, true) [throw error]: PASS [Input] { nil, true } [Received:error] { "bad argument #2 to 'cjson.encode_keep_buffer' (found too many arguments)" } ==> Test [93] Set encode_max_depth("wrong") [throw error]: PASS [Input] { "wrong" } [Received:error] { "bad argument #1 to 'cjson.encode_max_depth' (number expected, got string)" } ==> Test [94] Set decode_max_depth(0) [throw error]: PASS [Input] { "0" } [Received:error] { "bad argument #1 to 'cjson.decode_max_depth' (expected integer between 1 and 2147483647)" } ==> Test [95] Set encode_invalid_numbers(-2) [throw error]: PASS [Input] { -2 } [Received:error] { "bad argument #1 to 'cjson.encode_invalid_numbers' (invalid option '-2')" } ==> Test [96] Set decode_invalid_numbers(true, false) [throw error]: PASS [Input] { true, false } [Received:error] { "bad argument #2 to 'cjson.decode_invalid_numbers' (found too many arguments)" } ==> Test [97] Set encode_sparse_array("not quite on") [throw error]: PASS [Input] { "not quite on" } [Received:error] { "bad argument #1 to 'cjson.encode_sparse_array' (invalid option 'not quite on')" } ==> Reset Lua CJSON configuration ==> Test [98] Check encode_sparse_array(): PASS [Input] { } [Received:success] { false, 2, 10 } ==> Test [99] Encode (safe) simple value: PASS [Input] { true } [Received:success] { "true" } ==> Test [100] Encode (safe) argument validation [throw error]: PASS [Input] { "arg1", "arg2" } [Received:error] { "bad argument #1 to 'cjson.safe.encode' (expected 1 argument)" } ==> Test [101] Decode (safe) error generation: PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Test [102] Decode (safe) error generation after new(): PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Summary: all tests succeeded ************************************************** /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=link aarch64-linux-gnu-gcc -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.3 -Wall -Wextra -Wl,--no-add-needed \ -static -o /build/lua-cjson-2.1.0+dfsg/5.3-cjson/app-static -I . -I /build/lua-cjson-2.1.0+dfsg/5.3-cjson/ \ /usr/share/dh-lua/test/5.3/app.c /build/lua-cjson-2.1.0+dfsg/5.3-cjson/liblua5.3-cjson.la \ -Wl,-z,relro -Wl,-z,now -llua5.3 -lm -ldl libtool: link: aarch64-linux-gnu-gcc -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. 
-fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.3 -Wall -Wextra -Wl,--no-add-needed -o /build/lua-cjson-2.1.0+dfsg/5.3-cjson/app-static -I . -I /build/lua-cjson-2.1.0+dfsg/5.3-cjson/ /usr/share/dh-lua/test/5.3/app.c -Wl,-z -Wl,relro -Wl,-z -Wl,now /build/lua-cjson-2.1.0+dfsg/5.3-cjson/.libs/liblua5.3-cjson.a -llua5.3 -lm -ldl ldd /build/lua-cjson-2.1.0+dfsg/5.3-cjson/app-static linux-vdso.so.1 (0x0000ffffabee6000) liblua5.3.so.0 => /lib/aarch64-linux-gnu/liblua5.3.so.0 (0x0000ffffabe20000) libc.so.6 => /lib/aarch64-linux-gnu/libc.so.6 (0x0000ffffabc70000) /lib/ld-linux-aarch64.so.1 (0x0000ffffabea9000) libm.so.6 => /lib/aarch64-linux-gnu/libm.so.6 (0x0000ffffabbd0000) *********************** app static (5.3) ********* Test: cd tests/ && @@LUA@@ test.lua ==> Testing Lua CJSON version 2.1.0 ==> Test [1] Check module name, version: PASS [Input] { } [Received:success] { "cjson", "2.1.0" } ==> Test [2] Decode string: PASS [Input] { "\"test string\"" } [Received:success] { "test string" } ==> Test [3] Decode numbers: PASS [Input] { "[ 0.0, -5e3, -1, 0.3e-3, 1023.2, 0e10 ]" } [Received:success] { { 0.0, -5000.0, -1.0, 0.0003, 1023.2, 0.0 } } ==> Test [4] Decode null: PASS [Input] { "null" } [Received:success] { json.null } ==> Test [5] Decode true: PASS [Input] { "true" } [Received:success] { true } ==> Test [6] Decode false: PASS [Input] { "false" } [Received:success] { false } ==> Test [7] Decode object with numeric keys: PASS [Input] { "{ \"1\": \"one\", \"3\": \"three\" }" } [Received:success] { { ["1"] = "one", ["3"] = "three" } } ==> Test [8] Decode object with string keys: PASS [Input] { "{ \"a\": \"a\", \"b\": \"b\" }" } [Received:success] { { ["a"] = "a", ["b"] = "b" } } ==> Test [9] Decode array: PASS [Input] { "[ \"one\", null, \"three\" ]" } [Received:success] { { "one", json.null, "three" } } ==> Test [10] Decode UTF-16BE [throw error]: PASS [Input] { "\0\"\0\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [11] Decode UTF-16LE [throw error]: PASS [Input] { "\"\0\"\0" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [12] Decode UTF-32BE [throw error]: PASS [Input] { "\0\0\0\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [13] Decode UTF-32LE [throw error]: PASS [Input] { "\"\0\0\0" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [14] Decode partial JSON [throw error]: PASS [Input] { "{ \"unexpected eof\": " } [Received:error] { "Expected value but found T_END at character 21" } ==> Test [15] Decode with extra comma [throw error]: PASS [Input] { "{ \"extra data\": true }, false" } [Received:error] { "Expected the end but found T_COMMA at character 23" } ==> Test [16] Decode invalid escape code [throw error]: PASS [Input] { " { \"bad escape \\q code\" } " } [Received:error] { "Expected object key string but found invalid escape code at character 16" } ==> Test [17] Decode invalid unicode escape [throw error]: PASS [Input] { " { \"bad unicode \\u0f6 escape\" } " } [Received:error] { "Expected object key string but found invalid unicode escape code at character 17" } ==> Test [18] Decode invalid keyword [throw error]: PASS [Input] { " [ \"bad barewood\", test ] " } [Received:error] { "Expected value but found invalid token at character 20" } ==> Test [19] Decode invalid number #1 [throw error]: PASS [Input] { "[ -+12 ]" } [Received:error] { "Expected value but found invalid number at character 3" 
} ==> Test [20] Decode invalid number #2 [throw error]: PASS [Input] { "-v" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [21] Decode invalid number exponent [throw error]: PASS [Input] { "[ 0.4eg10 ]" } [Received:error] { "Expected comma or array end but found invalid token at character 6" } ==> Test [22] Set decode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [23] Decode array at nested limit: PASS [Input] { "[[[[[ \"nested\" ]]]]]" } [Received:success] { { { { { { "nested" } } } } } } ==> Test [24] Decode array over nested limit [throw error]: PASS [Input] { "[[[[[[ \"nested\" ]]]]]]" } [Received:error] { "Found too many nested data structures (6) at character 6" } ==> Test [25] Decode object at nested limit: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } [Received:success] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } ==> Test [26] Decode object over nested limit [throw error]: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":{\"f\":\"nested\"}}}}}}" } [Received:error] { "Found too many nested data structures (6) at character 26" } ==> Test [27] Set decode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [28] Decode deeply nested array [throw error]: PASS [Input] { "[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[1100]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]" } 
[Received:error] { "Found too many nested data structures (1001) at character 1001" } ==> Test [29] Set encode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [30] Encode nested table as array at nested limit: PASS [Input] { { { { { { "nested" } } } } } } [Received:success] { "[[[[[\"nested\"]]]]]" } ==> Test [31] Encode nested table as array after nested limit [throw error]: PASS [Input] { { { { { { { "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [32] Encode nested table as object at nested limit: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } [Received:success] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } ==> Test [33] Encode nested table as object over nested limit [throw error]: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = { ["f"] = "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [34] Encode table with cycle [throw error]: PASS [Input] { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [35] Set encode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [36] Encode deeply nested data [throw error]: PASS [Input] { { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = 
"string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = Cannot serialise any further: too many nested tables, [2] = "string", ["a"] = Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (1001)" } ==> Test [37] Encode null: PASS [Input] { json.null } [Received:success] { "null" } ==> Test [38] Encode true: PASS [Input] { true } [Received:success] { "true" } ==> Test [39] Encode false: PASS [Input] { false } [Received:success] { "false" } ==> Test [40] Encode empty object: PASS [Input] { { } } [Received:success] { "{}" } ==> Test [41] Encode integer: PASS [Input] { 10 } [Received:success] { "10" } ==> Test [42] Encode string: PASS [Input] { "hello" } [Received:success] { "\"hello\"" } ==> Test [43] Encode Lua function [throw error]: PASS [Input] { "" } [Received:error] { "Cannot serialise function: type not supported" } ==> Test [44] Set decode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [45] Decode hexadecimal: PASS [Input] { "0x6.ffp1" } [Received:success] { 13.9921875 } ==> Test [46] Decode numbers with leading zero: PASS [Input] { "[ 0123, 00.33 ]" } [Received:success] { { 123.0, 0.33 } } ==> Test [47] Decode +-Inf: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:success] { { inf, inf, -inf } } ==> Test [48] Decode +-Infinity: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:success] { { inf, inf, -inf } } ==> Test [49] Decode +-NaN: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:success] { { nan, nan, -nan } } ==> Test [50] Decode Infrared (not infinity) [throw error]: PASS [Input] { "Infrared" } [Received:error] { "Expected the end but found invalid token at character 4" } ==> Test [51] Decode Noodle (not NaN) [throw error]: PASS [Input] { "Noodle" } [Received:error] { "Expected value but found invalid token at character 1" } ==> Test [52] Set decode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [53] Decode hexadecimal [throw error]: PASS [Input] { "0x6" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [54] Decode numbers with leading zero [throw error]: PASS [Input] { "[ 0123, 00.33 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [55] Decode +-Inf [throw error]: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> 
Test [56] Decode +-Infinity [throw error]: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [57] Decode +-NaN [throw error]: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [58] Set decode_invalid_numbers("on"): PASS [Input] { "on" } [Received:success] { true } ==> Test [59] Set encode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [60] Encode NaN [throw error]: PASS [Input] { nan } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [61] Encode Infinity [throw error]: PASS [Input] { inf } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [62] Set encode_invalid_numbers("null"): PASS [Input] { "null" } [Received:success] { "null" } ==> Test [63] Encode NaN as null: PASS [Input] { nan } [Received:success] { "null" } ==> Test [64] Encode Infinity as null: PASS [Input] { inf } [Received:success] { "null" } ==> Test [65] Set encode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [66] Encode NaN: PASS [Input] { nan } [Received:success] { "nan" } ==> Test [67] Encode Infinity: PASS [Input] { inf } [Received:success] { "inf" } ==> Test [68] Set encode_invalid_numbers("off"): PASS [Input] { "off" } [Received:success] { false } ==> Test [69] Set encode_sparse_array(true, 2, 3): PASS [Input] { true, 2, 3 } [Received:success] { true, 2, 3 } ==> Test [70] Encode sparse table as array #1: PASS [Input] { { [3] = "sparse test" } } [Received:success] { "[null,null,\"sparse test\"]" } ==> Test [71] Encode sparse table as array #2: PASS [Input] { { "one", nil, nil, "sparse test" } } [Received:success] { "[\"one\",null,null,\"sparse test\"]" } ==> Test [72] Encode table with numeric string key as object: PASS [Input] { { ["2"] = "numeric string key test" } } [Received:success] { "{\"2\":\"numeric string key test\"}" } ==> Test [73] Set encode_sparse_array(false): PASS [Input] { false } [Received:success] { false, 2, 3 } ==> Test [74] Encode table with incompatible key [throw error]: PASS [Input] { { [false] = "wrong" } } [Received:error] { "Cannot serialise boolean: table key must be a number or string" } ==> Test [75] Encode all octets (8-bit clean): PASS [Input] { "\0\1\2\3\4\5\6\7\8\9\ \11\12\13\14\15\16\17\18\19\20\21\22\23\24\25\26\27\28\29\30\31 !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\127" } [Received:success] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } ==> Test [76] Decode all escaped octets: PASS [Input] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } [Received:success] { "\0\1\2\3\4\5\6\7\8\9\ \11\12\13\14\15\16\17\18\19\20\21\22\23\24\25\26\27\28\29\30\31 !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\127" } ==> Test [77] Decode single UTF-16 escape: 
PASS [Input] { "\"\\uF800\"" } [Received:success] { "" } ==> Test [78] Decode swapped surrogate pair [throw error]: PASS [Input] { "\"\\uDC00\\uD800\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [79] Decode duplicate high surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [80] Decode duplicate low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [81] Decode missing low surrogate [throw error]: PASS [Input] { "\"\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [82] Decode invalid low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uD\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Set locale to cs_CZ (comma separator) ==> Test [83] Encode number under comma locale: PASS [Input] { 1,5 } [Received:success] { "1.5" } ==> Test [84] Decode number in array under comma locale: PASS [Input] { "[ 10, \"test\" ]" } [Received:success] { { 10,0, "test" } } ==> Revert locale to POSIX ==> Test [85] Set encode_keep_buffer(false): PASS [Input] { false } [Received:success] { false } ==> Test [86] Set encode_number_precision(3): PASS [Input] { 3 } [Received:success] { 3 } ==> Test [87] Encode number with precision 3: PASS [Input] { 0.33333333333333 } [Received:success] { "0.333" } ==> Test [88] Set encode_number_precision(14): PASS [Input] { 14 } [Received:success] { 14 } ==> Test [89] Set encode_keep_buffer(true): PASS [Input] { true } [Received:success] { true } ==> Test [90] Set encode_number_precision(0) [throw error]: PASS [Input] { 0 } [Received:error] { "bad argument #1 to 'cjson.encode_number_precision' (expected integer between 1 and 14)" } ==> Test [91] Set encode_number_precision("five") [throw error]: PASS [Input] { "five" } [Received:error] { "bad argument #1 to 'cjson.encode_number_precision' (number expected, got string)" } ==> Test [92] Set encode_keep_buffer(nil, true) [throw error]: PASS [Input] { nil, true } [Received:error] { "bad argument #2 to 'cjson.encode_keep_buffer' (found too many arguments)" } ==> Test [93] Set encode_max_depth("wrong") [throw error]: PASS [Input] { "wrong" } [Received:error] { "bad argument #1 to 'cjson.encode_max_depth' (number expected, got string)" } ==> Test [94] Set decode_max_depth(0) [throw error]: PASS [Input] { "0" } [Received:error] { "bad argument #1 to 'cjson.decode_max_depth' (expected integer between 1 and 2147483647)" } ==> Test [95] Set encode_invalid_numbers(-2) [throw error]: PASS [Input] { -2 } [Received:error] { "bad argument #1 to 'cjson.encode_invalid_numbers' (invalid option '-2')" } ==> Test [96] Set decode_invalid_numbers(true, false) [throw error]: PASS [Input] { true, false } [Received:error] { "bad argument #2 to 'cjson.decode_invalid_numbers' (found too many arguments)" } ==> Test [97] Set encode_sparse_array("not quite on") [throw error]: PASS [Input] { "not quite on" } [Received:error] { "bad argument #1 to 'cjson.encode_sparse_array' (invalid option 'not quite on')" } ==> Reset Lua CJSON configuration ==> Test [98] Check encode_sparse_array(): PASS [Input] { } [Received:success] { false, 2, 10 } ==> Test [99] Encode (safe) simple value: PASS [Input] { true } [Received:success] { "true" } ==> Test [100] Encode (safe) argument 
validation [throw error]: PASS [Input] { "arg1", "arg2" } [Received:error] { "bad argument #1 to 'cjson.safe.encode' (expected 1 argument)" } ==> Test [101] Decode (safe) error generation: PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Test [102] Decode (safe) error generation after new(): PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Summary: all tests succeeded ************************************************** Target test made Making target test for debian/lua5.4.dh-lua.conf # tests Copying lua/cjson/util.lua in /build/lua-cjson-2.1.0+dfsg/5.4-cjson for test ********************** lua dynamic (5.4) ********* Test: cd tests/ && @@LUA@@ test.lua ==> Testing Lua CJSON version 2.1.0 ==> Test [1] Check module name, version: PASS [Input] { } [Received:success] { "cjson", "2.1.0" } ==> Test [2] Decode string: PASS [Input] { "\"test string\"" } [Received:success] { "test string" } ==> Test [3] Decode numbers: PASS [Input] { "[ 0.0, -5e3, -1, 0.3e-3, 1023.2, 0e10 ]" } [Received:success] { { 0.0, -5000.0, -1.0, 0.0003, 1023.2, 0.0 } } ==> Test [4] Decode null: PASS [Input] { "null" } [Received:success] { json.null } ==> Test [5] Decode true: PASS [Input] { "true" } [Received:success] { true } ==> Test [6] Decode false: PASS [Input] { "false" } [Received:success] { false } ==> Test [7] Decode object with numeric keys: PASS [Input] { "{ \"1\": \"one\", \"3\": \"three\" }" } [Received:success] { { ["3"] = "three", ["1"] = "one" } } ==> Test [8] Decode object with string keys: PASS [Input] { "{ \"a\": \"a\", \"b\": \"b\" }" } [Received:success] { { ["b"] = "b", ["a"] = "a" } } ==> Test [9] Decode array: PASS [Input] { "[ \"one\", null, \"three\" ]" } [Received:success] { { "one", json.null, "three" } } ==> Test [10] Decode UTF-16BE [throw error]: PASS [Input] { "\0\"\0\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [11] Decode UTF-16LE [throw error]: PASS [Input] { "\"\0\"\0" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [12] Decode UTF-32BE [throw error]: PASS [Input] { "\0\0\0\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [13] Decode UTF-32LE [throw error]: PASS [Input] { "\"\0\0\0" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [14] Decode partial JSON [throw error]: PASS [Input] { "{ \"unexpected eof\": " } [Received:error] { "Expected value but found T_END at character 21" } ==> Test [15] Decode with extra comma [throw error]: PASS [Input] { "{ \"extra data\": true }, false" } [Received:error] { "Expected the end but found T_COMMA at character 23" } ==> Test [16] Decode invalid escape code [throw error]: PASS [Input] { " { \"bad escape \\q code\" } " } [Received:error] { "Expected object key string but found invalid escape code at character 16" } ==> Test [17] Decode invalid unicode escape [throw error]: PASS [Input] { " { \"bad unicode \\u0f6 escape\" } " } [Received:error] { "Expected object key string but found invalid unicode escape code at character 17" } ==> Test [18] Decode invalid keyword [throw error]: PASS [Input] { " [ \"bad barewood\", test ] " } [Received:error] { "Expected value but found invalid token at character 20" } ==> Test [19] Decode invalid number #1 [throw error]: PASS [Input] { "[ -+12 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [20] Decode invalid number #2 
[throw error]: PASS [Input] { "-v" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [21] Decode invalid number exponent [throw error]: PASS [Input] { "[ 0.4eg10 ]" } [Received:error] { "Expected comma or array end but found invalid token at character 6" } ==> Test [22] Set decode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [23] Decode array at nested limit: PASS [Input] { "[[[[[ \"nested\" ]]]]]" } [Received:success] { { { { { { "nested" } } } } } } ==> Test [24] Decode array over nested limit [throw error]: PASS [Input] { "[[[[[[ \"nested\" ]]]]]]" } [Received:error] { "Found too many nested data structures (6) at character 6" } ==> Test [25] Decode object at nested limit: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } [Received:success] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } ==> Test [26] Decode object over nested limit [throw error]: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":{\"f\":\"nested\"}}}}}}" } [Received:error] { "Found too many nested data structures (6) at character 26" } ==> Test [27] Set decode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [28] Decode deeply nested array [throw error]: PASS [Input] { "[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[1100]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]" } [Received:error] { "Found too many nested data 
structures (1001) at character 1001" } ==> Test [29] Set encode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [30] Encode nested table as array at nested limit: PASS [Input] { { { { { { "nested" } } } } } } [Received:success] { "[[[[[\"nested\"]]]]]" } ==> Test [31] Encode nested table as array after nested limit [throw error]: PASS [Input] { { { { { { { "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [32] Encode nested table as object at nested limit: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } [Received:success] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } ==> Test [33] Encode nested table as object over nested limit [throw error]: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = { ["f"] = "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [34] Encode table with cycle [throw error]: PASS [Input] { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [35] Set encode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [36] Encode deeply nested data [throw error]: PASS [Input] { { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null 
}, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = Cannot serialise any further: too many nested tables, [2] = "string", ["a"] = Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (1001)" } ==> Test [37] Encode null: PASS [Input] { json.null } [Received:success] { "null" } ==> Test [38] Encode true: PASS [Input] { true } [Received:success] { "true" } ==> Test [39] Encode false: PASS [Input] { false } [Received:success] { "false" } ==> Test [40] Encode empty object: PASS [Input] { { } } [Received:success] { "{}" } ==> Test [41] Encode integer: PASS [Input] { 10 } [Received:success] { "10" } ==> Test [42] Encode string: PASS [Input] { "hello" } [Received:success] { "\"hello\"" } ==> Test [43] Encode Lua function [throw error]: PASS [Input] { "" } [Received:error] { "Cannot serialise function: type not supported" } ==> Test [44] Set decode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [45] Decode hexadecimal: PASS [Input] { "0x6.ffp1" } [Received:success] { 13.9921875 } ==> Test [46] Decode numbers with leading zero: PASS [Input] { "[ 0123, 00.33 ]" } [Received:success] { { 123.0, 0.33 } } ==> Test [47] Decode +-Inf: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:success] { { inf, inf, -inf } } ==> Test [48] Decode +-Infinity: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:success] { { inf, inf, -inf } } ==> Test [49] Decode +-NaN: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:success] { { nan, nan, -nan } } ==> Test [50] Decode Infrared (not infinity) [throw error]: PASS [Input] { "Infrared" } [Received:error] { "Expected the end but found invalid token at character 4" } ==> Test [51] Decode Noodle (not NaN) [throw error]: PASS [Input] { "Noodle" } [Received:error] { "Expected value but found invalid token at character 1" } ==> Test [52] Set decode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [53] Decode hexadecimal [throw error]: PASS [Input] { "0x6" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [54] Decode numbers with leading zero [throw error]: PASS [Input] { "[ 0123, 00.33 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [55] Decode +-Inf [throw error]: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [56] Decode +-Infinity [throw error]: PASS [Input] 
{ "[ +Infinity, Infinity, -Infinity ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [57] Decode +-NaN [throw error]: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [58] Set decode_invalid_numbers("on"): PASS [Input] { "on" } [Received:success] { true } ==> Test [59] Set encode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [60] Encode NaN [throw error]: PASS [Input] { nan } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [61] Encode Infinity [throw error]: PASS [Input] { inf } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [62] Set encode_invalid_numbers("null"): PASS [Input] { "null" } [Received:success] { "null" } ==> Test [63] Encode NaN as null: PASS [Input] { nan } [Received:success] { "null" } ==> Test [64] Encode Infinity as null: PASS [Input] { inf } [Received:success] { "null" } ==> Test [65] Set encode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [66] Encode NaN: PASS [Input] { nan } [Received:success] { "nan" } ==> Test [67] Encode Infinity: PASS [Input] { inf } [Received:success] { "inf" } ==> Test [68] Set encode_invalid_numbers("off"): PASS [Input] { "off" } [Received:success] { false } ==> Test [69] Set encode_sparse_array(true, 2, 3): PASS [Input] { true, 2, 3 } [Received:success] { true, 2, 3 } ==> Test [70] Encode sparse table as array #1: PASS [Input] { { [3] = "sparse test" } } [Received:success] { "[null,null,\"sparse test\"]" } ==> Test [71] Encode sparse table as array #2: PASS [Input] { { "one", nil, nil, "sparse test" } } [Received:success] { "[\"one\",null,null,\"sparse test\"]" } ==> Test [72] Encode table with numeric string key as object: PASS [Input] { { ["2"] = "numeric string key test" } } [Received:success] { "{\"2\":\"numeric string key test\"}" } ==> Test [73] Set encode_sparse_array(false): PASS [Input] { false } [Received:success] { false, 2, 3 } ==> Test [74] Encode table with incompatible key [throw error]: PASS [Input] { { [false] = "wrong" } } [Received:error] { "Cannot serialise boolean: table key must be a number or string" } ==> Test [75] Encode all octets (8-bit clean): PASS [Input] { "\0\1\2\3\4\5\6\7\8\9\ \11\12\13\14\15\16\17\18\19\20\21\22\23\24\25\26\27\28\29\30\31 !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\127" } [Received:success] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } ==> Test [76] Decode all escaped octets: PASS [Input] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } [Received:success] { "\0\1\2\3\4\5\6\7\8\9\ \11\12\13\14\15\16\17\18\19\20\21\22\23\24\25\26\27\28\29\30\31 !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\127" } ==> Test [77] Decode single UTF-16 escape: PASS [Input] { "\"\\uF800\"" } [Received:success] { "" } 
==> Test [78] Decode swapped surrogate pair [throw error]: PASS [Input] { "\"\\uDC00\\uD800\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [79] Decode duplicate high surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [80] Decode duplicate low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [81] Decode missing low surrogate [throw error]: PASS [Input] { "\"\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [82] Decode invalid low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uD\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Set locale to cs_CZ (comma separator) ==> Test [83] Encode number under comma locale: PASS [Input] { 1,5 } [Received:success] { "1.5" } ==> Test [84] Decode number in array under comma locale: PASS [Input] { "[ 10, \"test\" ]" } [Received:success] { { 10,0, "test" } } ==> Revert locale to POSIX ==> Test [85] Set encode_keep_buffer(false): PASS [Input] { false } [Received:success] { false } ==> Test [86] Set encode_number_precision(3): PASS [Input] { 3 } [Received:success] { 3 } ==> Test [87] Encode number with precision 3: PASS [Input] { 0.33333333333333 } [Received:success] { "0.333" } ==> Test [88] Set encode_number_precision(14): PASS [Input] { 14 } [Received:success] { 14 } ==> Test [89] Set encode_keep_buffer(true): PASS [Input] { true } [Received:success] { true } ==> Test [90] Set encode_number_precision(0) [throw error]: PASS [Input] { 0 } [Received:error] { "bad argument #1 to 'cjson.encode_number_precision' (expected integer between 1 and 14)" } ==> Test [91] Set encode_number_precision("five") [throw error]: PASS [Input] { "five" } [Received:error] { "bad argument #1 to 'cjson.encode_number_precision' (number expected, got string)" } ==> Test [92] Set encode_keep_buffer(nil, true) [throw error]: PASS [Input] { nil, true } [Received:error] { "bad argument #2 to 'cjson.encode_keep_buffer' (found too many arguments)" } ==> Test [93] Set encode_max_depth("wrong") [throw error]: PASS [Input] { "wrong" } [Received:error] { "bad argument #1 to 'cjson.encode_max_depth' (number expected, got string)" } ==> Test [94] Set decode_max_depth(0) [throw error]: PASS [Input] { "0" } [Received:error] { "bad argument #1 to 'cjson.decode_max_depth' (expected integer between 1 and 2147483647)" } ==> Test [95] Set encode_invalid_numbers(-2) [throw error]: PASS [Input] { -2 } [Received:error] { "bad argument #1 to 'cjson.encode_invalid_numbers' (invalid option '-2')" } ==> Test [96] Set decode_invalid_numbers(true, false) [throw error]: PASS [Input] { true, false } [Received:error] { "bad argument #2 to 'cjson.decode_invalid_numbers' (found too many arguments)" } ==> Test [97] Set encode_sparse_array("not quite on") [throw error]: PASS [Input] { "not quite on" } [Received:error] { "bad argument #1 to 'cjson.encode_sparse_array' (invalid option 'not quite on')" } ==> Reset Lua CJSON configuration ==> Test [98] Check encode_sparse_array(): PASS [Input] { } [Received:success] { false, 2, 10 } ==> Test [99] Encode (safe) simple value: PASS [Input] { true } [Received:success] { "true" } ==> Test [100] Encode (safe) argument validation [throw error]: PASS [Input] { "arg1", "arg2" } 
[Received:error] { "bad argument #1 to 'cjson.safe.encode' (expected 1 argument)" } ==> Test [101] Decode (safe) error generation: PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Test [102] Decode (safe) error generation after new(): PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Summary: all tests succeeded ************************************************** /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=link aarch64-linux-gnu-gcc -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.4 -Wall -Wextra -Wl,--no-add-needed \ -o /build/lua-cjson-2.1.0+dfsg/5.4-cjson/app-dynamic -I . -I /build/lua-cjson-2.1.0+dfsg/5.4-cjson/ \ /usr/share/dh-lua/test/5.4/app.c /build/lua-cjson-2.1.0+dfsg/5.4-cjson/liblua5.4-cjson.la \ -Wl,-z,relro -Wl,-z,now -llua5.4 libtool: link: aarch64-linux-gnu-gcc -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.4 -Wall -Wextra -Wl,--no-add-needed -o /build/lua-cjson-2.1.0+dfsg/5.4-cjson/.libs/app-dynamic -I . -I /build/lua-cjson-2.1.0+dfsg/5.4-cjson/ /usr/share/dh-lua/test/5.4/app.c -Wl,-z -Wl,relro -Wl,-z -Wl,now /build/lua-cjson-2.1.0+dfsg/5.4-cjson/.libs/liblua5.4-cjson.so -llua5.4 -Wl,-rpath -Wl,/usr//lib/aarch64-linux-gnu /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=execute -dlopen /build/lua-cjson-2.1.0+dfsg/5.4-cjson/liblua5.4-cjson.la \ ldd /build/lua-cjson-2.1.0+dfsg/5.4-cjson/app-dynamic linux-vdso.so.1 (0x0000ffff8e433000) liblua5.4-cjson.so.0 => /build/lua-cjson-2.1.0+dfsg/5.4-cjson/.libs/liblua5.4-cjson.so.0 (0x0000ffff8e3a0000) liblua5.4.so.0 => /usr//lib/aarch64-linux-gnu/liblua5.4.so.0 (0x0000ffff8e340000) libc.so.6 => /usr//lib/aarch64-linux-gnu/libc.so.6 (0x0000ffff8e190000) /lib/ld-linux-aarch64.so.1 (0x0000ffff8e3f6000) libm.so.6 => /lib/aarch64-linux-gnu/libm.so.6 (0x0000ffff8e0f0000) libdl.so.2 => /lib/aarch64-linux-gnu/libdl.so.2 (0x0000ffff8e0c0000) ********************** app dynamic (5.4) ********* Test: cd tests/ && @@LUA@@ test.lua ==> Testing Lua CJSON version 2.1.0 ==> Test [1] Check module name, version: PASS [Input] { } [Received:success] { "cjson", "2.1.0" } ==> Test [2] Decode string: PASS [Input] { "\"test string\"" } [Received:success] { "test string" } ==> Test [3] Decode numbers: PASS [Input] { "[ 0.0, -5e3, -1, 0.3e-3, 1023.2, 0e10 ]" } [Received:success] { { 0.0, -5000.0, -1.0, 0.0003, 1023.2, 0.0 } } ==> Test [4] Decode null: PASS [Input] { "null" } [Received:success] { json.null } ==> Test [5] Decode true: PASS [Input] { "true" } [Received:success] { true } ==> Test [6] Decode false: PASS [Input] { "false" } [Received:success] { false } ==> Test [7] Decode object with numeric keys: PASS [Input] { "{ \"1\": \"one\", \"3\": \"three\" }" } [Received:success] { { ["1"] = "one", ["3"] = "three" } } ==> Test [8] Decode object with string keys: PASS [Input] { "{ \"a\": \"a\", \"b\": \"b\" }" } [Received:success] { { ["a"] = "a", ["b"] = "b" } } ==> Test [9] Decode array: PASS [Input] { "[ \"one\", null, \"three\" ]" } [Received:success] { { "one", json.null, "three" } } ==> Test [10] Decode UTF-16BE [throw error]: PASS [Input] { "\0\"\0\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [11] Decode 
UTF-16LE [throw error]: PASS [Input] { "\"\0\"\0" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [12] Decode UTF-32BE [throw error]: PASS [Input] { "\0\0\0\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [13] Decode UTF-32LE [throw error]: PASS [Input] { "\"\0\0\0" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [14] Decode partial JSON [throw error]: PASS [Input] { "{ \"unexpected eof\": " } [Received:error] { "Expected value but found T_END at character 21" } ==> Test [15] Decode with extra comma [throw error]: PASS [Input] { "{ \"extra data\": true }, false" } [Received:error] { "Expected the end but found T_COMMA at character 23" } ==> Test [16] Decode invalid escape code [throw error]: PASS [Input] { " { \"bad escape \\q code\" } " } [Received:error] { "Expected object key string but found invalid escape code at character 16" } ==> Test [17] Decode invalid unicode escape [throw error]: PASS [Input] { " { \"bad unicode \\u0f6 escape\" } " } [Received:error] { "Expected object key string but found invalid unicode escape code at character 17" } ==> Test [18] Decode invalid keyword [throw error]: PASS [Input] { " [ \"bad barewood\", test ] " } [Received:error] { "Expected value but found invalid token at character 20" } ==> Test [19] Decode invalid number #1 [throw error]: PASS [Input] { "[ -+12 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [20] Decode invalid number #2 [throw error]: PASS [Input] { "-v" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [21] Decode invalid number exponent [throw error]: PASS [Input] { "[ 0.4eg10 ]" } [Received:error] { "Expected comma or array end but found invalid token at character 6" } ==> Test [22] Set decode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [23] Decode array at nested limit: PASS [Input] { "[[[[[ \"nested\" ]]]]]" } [Received:success] { { { { { { "nested" } } } } } } ==> Test [24] Decode array over nested limit [throw error]: PASS [Input] { "[[[[[[ \"nested\" ]]]]]]" } [Received:error] { "Found too many nested data structures (6) at character 6" } ==> Test [25] Decode object at nested limit: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } [Received:success] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } ==> Test [26] Decode object over nested limit [throw error]: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":{\"f\":\"nested\"}}}}}}" } [Received:error] { "Found too many nested data structures (6) at character 26" } ==> Test [27] Set decode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [28] Decode deeply nested array [throw error]: PASS [Input] { 
"[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[1100]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]" } [Received:error] { "Found too many nested data structures (1001) at character 1001" } ==> Test [29] Set encode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [30] Encode nested table as array at nested limit: PASS [Input] { { { { { { "nested" } } } } } } [Received:success] { "[[[[[\"nested\"]]]]]" } ==> Test [31] Encode nested table as array after nested limit [throw error]: PASS [Input] { { { { { { { "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [32] Encode nested table as object at nested limit: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } [Received:success] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } ==> Test [33] Encode nested table as object over nested limit [throw error]: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = { ["f"] = "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [34] Encode table with cycle [throw error]: PASS [Input] { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive 
nesting (6)" } ==> Test [35] Set encode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [36] Encode deeply nested data [throw error]: PASS [Input] { { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = Cannot serialise any further: too many nested tables, [2] = "string", ["a"] = Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (1001)" } 
==> Test [37] Encode null: PASS [Input] { json.null } [Received:success] { "null" } ==> Test [38] Encode true: PASS [Input] { true } [Received:success] { "true" } ==> Test [39] Encode false: PASS [Input] { false } [Received:success] { "false" } ==> Test [40] Encode empty object: PASS [Input] { { } } [Received:success] { "{}" } ==> Test [41] Encode integer: PASS [Input] { 10 } [Received:success] { "10" } ==> Test [42] Encode string: PASS [Input] { "hello" } [Received:success] { "\"hello\"" } ==> Test [43] Encode Lua function [throw error]: PASS [Input] { "" } [Received:error] { "Cannot serialise function: type not supported" } ==> Test [44] Set decode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [45] Decode hexadecimal: PASS [Input] { "0x6.ffp1" } [Received:success] { 13.9921875 } ==> Test [46] Decode numbers with leading zero: PASS [Input] { "[ 0123, 00.33 ]" } [Received:success] { { 123.0, 0.33 } } ==> Test [47] Decode +-Inf: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:success] { { inf, inf, -inf } } ==> Test [48] Decode +-Infinity: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:success] { { inf, inf, -inf } } ==> Test [49] Decode +-NaN: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:success] { { nan, nan, -nan } } ==> Test [50] Decode Infrared (not infinity) [throw error]: PASS [Input] { "Infrared" } [Received:error] { "Expected the end but found invalid token at character 4" } ==> Test [51] Decode Noodle (not NaN) [throw error]: PASS [Input] { "Noodle" } [Received:error] { "Expected value but found invalid token at character 1" } ==> Test [52] Set decode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [53] Decode hexadecimal [throw error]: PASS [Input] { "0x6" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [54] Decode numbers with leading zero [throw error]: PASS [Input] { "[ 0123, 00.33 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [55] Decode +-Inf [throw error]: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [56] Decode +-Infinity [throw error]: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [57] Decode +-NaN [throw error]: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [58] Set decode_invalid_numbers("on"): PASS [Input] { "on" } [Received:success] { true } ==> Test [59] Set encode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [60] Encode NaN [throw error]: PASS [Input] { nan } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [61] Encode Infinity [throw error]: PASS [Input] { inf } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [62] Set encode_invalid_numbers("null"): PASS [Input] { "null" } [Received:success] { "null" } ==> Test [63] Encode NaN as null: PASS [Input] { nan } [Received:success] { "null" } ==> Test [64] Encode Infinity as null: PASS [Input] { inf } [Received:success] { "null" } ==> Test [65] Set encode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [66] Encode NaN: PASS [Input] { nan } [Received:success] { "nan" } ==> Test [67] Encode Infinity: PASS [Input] { inf } [Received:success] { "inf" } ==> Test [68] 
Set encode_invalid_numbers("off"): PASS [Input] { "off" } [Received:success] { false } ==> Test [69] Set encode_sparse_array(true, 2, 3): PASS [Input] { true, 2, 3 } [Received:success] { true, 2, 3 } ==> Test [70] Encode sparse table as array #1: PASS [Input] { { [3] = "sparse test" } } [Received:success] { "[null,null,\"sparse test\"]" } ==> Test [71] Encode sparse table as array #2: PASS [Input] { { "one", nil, nil, "sparse test" } } [Received:success] { "[\"one\",null,null,\"sparse test\"]" } ==> Test [72] Encode table with numeric string key as object: PASS [Input] { { ["2"] = "numeric string key test" } } [Received:success] { "{\"2\":\"numeric string key test\"}" } ==> Test [73] Set encode_sparse_array(false): PASS [Input] { false } [Received:success] { false, 2, 3 } ==> Test [74] Encode table with incompatible key [throw error]: PASS [Input] { { [false] = "wrong" } } [Received:error] { "Cannot serialise boolean: table key must be a number or string" } ==> Test [75] Encode all octets (8-bit clean): PASS [Input] { "\0\1\2\3\4\5\6\7\8\9\ \11\12\13\14\15\16\17\18\19\20\21\22\23\24\25\26\27\28\29\30\31 !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\127" } [Received:success] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } ==> Test [76] Decode all escaped octets: PASS [Input] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } [Received:success] { "\0\1\2\3\4\5\6\7\8\9\ \11\12\13\14\15\16\17\18\19\20\21\22\23\24\25\26\27\28\29\30\31 !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\127" } ==> Test [77] Decode single UTF-16 escape: PASS [Input] { "\"\\uF800\"" } [Received:success] { "" } ==> Test [78] Decode swapped surrogate pair [throw error]: PASS [Input] { "\"\\uDC00\\uD800\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [79] Decode duplicate high surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [80] Decode duplicate low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [81] Decode missing low surrogate [throw error]: PASS [Input] { "\"\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [82] Decode invalid low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uD\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Set locale to cs_CZ (comma separator) ==> Test [83] Encode number under comma locale: PASS [Input] { 1,5 } [Received:success] { "1.5" } ==> Test [84] Decode number in array under comma locale: PASS [Input] { "[ 10, \"test\" ]" } [Received:success] { { 10,0, "test" } } ==> Revert locale to POSIX ==> Test [85] Set encode_keep_buffer(false): PASS [Input] { 
false } [Received:success] { false } ==> Test [86] Set encode_number_precision(3): PASS [Input] { 3 } [Received:success] { 3 } ==> Test [87] Encode number with precision 3: PASS [Input] { 0.33333333333333 } [Received:success] { "0.333" } ==> Test [88] Set encode_number_precision(14): PASS [Input] { 14 } [Received:success] { 14 } ==> Test [89] Set encode_keep_buffer(true): PASS [Input] { true } [Received:success] { true } ==> Test [90] Set encode_number_precision(0) [throw error]: PASS [Input] { 0 } [Received:error] { "bad argument #1 to 'cjson.encode_number_precision' (expected integer between 1 and 14)" } ==> Test [91] Set encode_number_precision("five") [throw error]: PASS [Input] { "five" } [Received:error] { "bad argument #1 to 'cjson.encode_number_precision' (number expected, got string)" } ==> Test [92] Set encode_keep_buffer(nil, true) [throw error]: PASS [Input] { nil, true } [Received:error] { "bad argument #2 to 'cjson.encode_keep_buffer' (found too many arguments)" } ==> Test [93] Set encode_max_depth("wrong") [throw error]: PASS [Input] { "wrong" } [Received:error] { "bad argument #1 to 'cjson.encode_max_depth' (number expected, got string)" } ==> Test [94] Set decode_max_depth(0) [throw error]: PASS [Input] { "0" } [Received:error] { "bad argument #1 to 'cjson.decode_max_depth' (expected integer between 1 and 2147483647)" } ==> Test [95] Set encode_invalid_numbers(-2) [throw error]: PASS [Input] { -2 } [Received:error] { "bad argument #1 to 'cjson.encode_invalid_numbers' (invalid option '-2')" } ==> Test [96] Set decode_invalid_numbers(true, false) [throw error]: PASS [Input] { true, false } [Received:error] { "bad argument #2 to 'cjson.decode_invalid_numbers' (found too many arguments)" } ==> Test [97] Set encode_sparse_array("not quite on") [throw error]: PASS [Input] { "not quite on" } [Received:error] { "bad argument #1 to 'cjson.encode_sparse_array' (invalid option 'not quite on')" } ==> Reset Lua CJSON configuration ==> Test [98] Check encode_sparse_array(): PASS [Input] { } [Received:success] { false, 2, 10 } ==> Test [99] Encode (safe) simple value: PASS [Input] { true } [Received:success] { "true" } ==> Test [100] Encode (safe) argument validation [throw error]: PASS [Input] { "arg1", "arg2" } [Received:error] { "bad argument #1 to 'cjson.safe.encode' (expected 1 argument)" } ==> Test [101] Decode (safe) error generation: PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Test [102] Decode (safe) error generation after new(): PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Summary: all tests succeeded ************************************************** /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --mode=link aarch64-linux-gnu-gcc -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.4 -Wall -Wextra -Wl,--no-add-needed \ -static -o /build/lua-cjson-2.1.0+dfsg/5.4-cjson/app-static -I . -I /build/lua-cjson-2.1.0+dfsg/5.4-cjson/ \ /usr/share/dh-lua/test/5.4/app.c /build/lua-cjson-2.1.0+dfsg/5.4-cjson/liblua5.4-cjson.la \ -Wl,-z,relro -Wl,-z,now -llua5.4 -lm -ldl libtool: link: aarch64-linux-gnu-gcc -g -O2 -ffile-prefix-map=/build/lua-cjson-2.1.0+dfsg=. 
-fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -I/usr/include/lua5.4 -Wall -Wextra -Wl,--no-add-needed -o /build/lua-cjson-2.1.0+dfsg/5.4-cjson/app-static -I . -I /build/lua-cjson-2.1.0+dfsg/5.4-cjson/ /usr/share/dh-lua/test/5.4/app.c -Wl,-z -Wl,relro -Wl,-z -Wl,now /build/lua-cjson-2.1.0+dfsg/5.4-cjson/.libs/liblua5.4-cjson.a -llua5.4 -lm -ldl ldd /build/lua-cjson-2.1.0+dfsg/5.4-cjson/app-static linux-vdso.so.1 (0x0000ffffa802c000) liblua5.4.so.0 => /lib/aarch64-linux-gnu/liblua5.4.so.0 (0x0000ffffa7f60000) libc.so.6 => /lib/aarch64-linux-gnu/libc.so.6 (0x0000ffffa7db0000) /lib/ld-linux-aarch64.so.1 (0x0000ffffa7fef000) libm.so.6 => /lib/aarch64-linux-gnu/libm.so.6 (0x0000ffffa7d10000) libdl.so.2 => /lib/aarch64-linux-gnu/libdl.so.2 (0x0000ffffa7ce0000) *********************** app static (5.4) ********* Test: cd tests/ && @@LUA@@ test.lua ==> Testing Lua CJSON version 2.1.0 ==> Test [1] Check module name, version: PASS [Input] { } [Received:success] { "cjson", "2.1.0" } ==> Test [2] Decode string: PASS [Input] { "\"test string\"" } [Received:success] { "test string" } ==> Test [3] Decode numbers: PASS [Input] { "[ 0.0, -5e3, -1, 0.3e-3, 1023.2, 0e10 ]" } [Received:success] { { 0.0, -5000.0, -1.0, 0.0003, 1023.2, 0.0 } } ==> Test [4] Decode null: PASS [Input] { "null" } [Received:success] { json.null } ==> Test [5] Decode true: PASS [Input] { "true" } [Received:success] { true } ==> Test [6] Decode false: PASS [Input] { "false" } [Received:success] { false } ==> Test [7] Decode object with numeric keys: PASS [Input] { "{ \"1\": \"one\", \"3\": \"three\" }" } [Received:success] { { ["3"] = "three", ["1"] = "one" } } ==> Test [8] Decode object with string keys: PASS [Input] { "{ \"a\": \"a\", \"b\": \"b\" }" } [Received:success] { { ["b"] = "b", ["a"] = "a" } } ==> Test [9] Decode array: PASS [Input] { "[ \"one\", null, \"three\" ]" } [Received:success] { { "one", json.null, "three" } } ==> Test [10] Decode UTF-16BE [throw error]: PASS [Input] { "\0\"\0\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [11] Decode UTF-16LE [throw error]: PASS [Input] { "\"\0\"\0" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [12] Decode UTF-32BE [throw error]: PASS [Input] { "\0\0\0\"" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [13] Decode UTF-32LE [throw error]: PASS [Input] { "\"\0\0\0" } [Received:error] { "JSON parser does not support UTF-16 or UTF-32" } ==> Test [14] Decode partial JSON [throw error]: PASS [Input] { "{ \"unexpected eof\": " } [Received:error] { "Expected value but found T_END at character 21" } ==> Test [15] Decode with extra comma [throw error]: PASS [Input] { "{ \"extra data\": true }, false" } [Received:error] { "Expected the end but found T_COMMA at character 23" } ==> Test [16] Decode invalid escape code [throw error]: PASS [Input] { " { \"bad escape \\q code\" } " } [Received:error] { "Expected object key string but found invalid escape code at character 16" } ==> Test [17] Decode invalid unicode escape [throw error]: PASS [Input] { " { \"bad unicode \\u0f6 escape\" } " } [Received:error] { "Expected object key string but found invalid unicode escape code at character 17" } ==> Test [18] Decode invalid keyword [throw error]: PASS [Input] { " [ \"bad barewood\", test ] " } [Received:error] { "Expected value but found invalid token at character 20" } ==> Test [19] Decode invalid number #1 [throw error]: PASS [Input] { "[ -+12 ]" } 
[Received:error] { "Expected value but found invalid number at character 3" } ==> Test [20] Decode invalid number #2 [throw error]: PASS [Input] { "-v" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [21] Decode invalid number exponent [throw error]: PASS [Input] { "[ 0.4eg10 ]" } [Received:error] { "Expected comma or array end but found invalid token at character 6" } ==> Test [22] Set decode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [23] Decode array at nested limit: PASS [Input] { "[[[[[ \"nested\" ]]]]]" } [Received:success] { { { { { { "nested" } } } } } } ==> Test [24] Decode array over nested limit [throw error]: PASS [Input] { "[[[[[[ \"nested\" ]]]]]]" } [Received:error] { "Found too many nested data structures (6) at character 6" } ==> Test [25] Decode object at nested limit: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } [Received:success] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } ==> Test [26] Decode object over nested limit [throw error]: PASS [Input] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":{\"f\":\"nested\"}}}}}}" } [Received:error] { "Found too many nested data structures (6) at character 26" } ==> Test [27] Set decode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [28] Decode deeply nested array [throw error]: PASS [Input] { "[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[1100]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]
]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]" } [Received:error] { "Found too many nested data structures (1001) at character 1001" } ==> Test [29] Set encode_max_depth(5): PASS [Input] { 5 } [Received:success] { 5 } ==> Test [30] Encode nested table as array at nested limit: PASS [Input] { { { { { { "nested" } } } } } } [Received:success] { "[[[[[\"nested\"]]]]]" } ==> Test [31] Encode nested table as array after nested limit [throw error]: PASS [Input] { { { { { { { "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [32] Encode nested table as object at nested limit: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = "nested" } } } } } } [Received:success] { "{\"a\":{\"b\":{\"c\":{\"d\":{\"e\":\"nested\"}}}}}" } ==> Test [33] Encode nested table as object over nested limit [throw error]: PASS [Input] { { ["a"] = { ["b"] = { ["c"] = { ["d"] = { ["e"] = { ["f"] = "nested" } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [34] Encode table with cycle [throw error]: PASS [Input] { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { { Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (6)" } ==> Test [35] Set encode_max_depth(1000): PASS [Input] { 1000 } [Received:success] { 1000 } ==> Test [36] Encode deeply nested data [throw error]: PASS [Input] { { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", 
["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = { 10, false, true, json.null }, [2] = "string", ["a"] = { [1] = Cannot serialise any further: too many nested tables, [2] = "string", ["a"] = Cannot serialise any further: too many nested tables } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } [Received:error] { "Cannot serialise, excessive nesting (1001)" } ==> Test [37] Encode null: PASS [Input] { json.null } [Received:success] { "null" } ==> Test [38] Encode true: PASS [Input] { true } [Received:success] { "true" } ==> Test [39] Encode false: PASS [Input] { false } [Received:success] { "false" } ==> Test [40] Encode empty object: PASS [Input] { { } } [Received:success] { "{}" } ==> Test [41] Encode integer: PASS [Input] { 10 } [Received:success] { "10" } ==> Test [42] Encode string: PASS [Input] { "hello" } [Received:success] { "\"hello\"" } ==> Test [43] Encode Lua function [throw error]: PASS [Input] { "" } [Received:error] { "Cannot serialise function: type not supported" } ==> Test [44] Set decode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [45] Decode hexadecimal: PASS [Input] { "0x6.ffp1" } [Received:success] { 13.9921875 } ==> Test [46] Decode numbers with leading zero: PASS [Input] { "[ 0123, 00.33 ]" } [Received:success] { { 123.0, 0.33 } } ==> Test [47] Decode +-Inf: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:success] { { inf, inf, -inf } } ==> Test [48] Decode +-Infinity: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:success] { { inf, inf, -inf } } ==> Test [49] Decode +-NaN: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:success] { { nan, nan, -nan } } ==> Test [50] Decode Infrared (not infinity) [throw error]: PASS [Input] { "Infrared" } [Received:error] { "Expected the end but found invalid token at character 4" } ==> Test [51] Decode Noodle (not NaN) [throw error]: PASS [Input] { "Noodle" } [Received:error] { "Expected value but found invalid token at character 1" } ==> Test [52] Set decode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [53] Decode hexadecimal [throw error]: PASS [Input] { "0x6" } [Received:error] { "Expected value but found invalid number at character 1" } ==> Test [54] Decode numbers with leading zero [throw error]: PASS [Input] { "[ 0123, 00.33 ]" } [Received:error] { "Expected value but found invalid number at character 3" } ==> Test [55] Decode +-Inf [throw error]: PASS [Input] { "[ +Inf, Inf, -Inf ]" } [Received:error] { 
"Expected value but found invalid token at character 3" } ==> Test [56] Decode +-Infinity [throw error]: PASS [Input] { "[ +Infinity, Infinity, -Infinity ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [57] Decode +-NaN [throw error]: PASS [Input] { "[ +NaN, NaN, -NaN ]" } [Received:error] { "Expected value but found invalid token at character 3" } ==> Test [58] Set decode_invalid_numbers("on"): PASS [Input] { "on" } [Received:success] { true } ==> Test [59] Set encode_invalid_numbers(false): PASS [Input] { false } [Received:success] { false } ==> Test [60] Encode NaN [throw error]: PASS [Input] { nan } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [61] Encode Infinity [throw error]: PASS [Input] { inf } [Received:error] { "Cannot serialise number: must not be NaN or Inf" } ==> Test [62] Set encode_invalid_numbers("null"): PASS [Input] { "null" } [Received:success] { "null" } ==> Test [63] Encode NaN as null: PASS [Input] { nan } [Received:success] { "null" } ==> Test [64] Encode Infinity as null: PASS [Input] { inf } [Received:success] { "null" } ==> Test [65] Set encode_invalid_numbers(true): PASS [Input] { true } [Received:success] { true } ==> Test [66] Encode NaN: PASS [Input] { nan } [Received:success] { "nan" } ==> Test [67] Encode Infinity: PASS [Input] { inf } [Received:success] { "inf" } ==> Test [68] Set encode_invalid_numbers("off"): PASS [Input] { "off" } [Received:success] { false } ==> Test [69] Set encode_sparse_array(true, 2, 3): PASS [Input] { true, 2, 3 } [Received:success] { true, 2, 3 } ==> Test [70] Encode sparse table as array #1: PASS [Input] { { [3] = "sparse test" } } [Received:success] { "[null,null,\"sparse test\"]" } ==> Test [71] Encode sparse table as array #2: PASS [Input] { { "one", nil, nil, "sparse test" } } [Received:success] { "[\"one\",null,null,\"sparse test\"]" } ==> Test [72] Encode table with numeric string key as object: PASS [Input] { { ["2"] = "numeric string key test" } } [Received:success] { "{\"2\":\"numeric string key test\"}" } ==> Test [73] Set encode_sparse_array(false): PASS [Input] { false } [Received:success] { false, 2, 3 } ==> Test [74] Encode table with incompatible key [throw error]: PASS [Input] { { [false] = "wrong" } } [Received:error] { "Cannot serialise boolean: table key must be a number or string" } ==> Test [75] Encode all octets (8-bit clean): PASS [Input] { "\0\1\2\3\4\5\6\7\8\9\ \11\12\13\14\15\16\17\18\19\20\21\22\23\24\25\26\27\28\29\30\31 !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\127" } [Received:success] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } ==> Test [76] Decode all escaped octets: PASS [Input] { "\"\\u0000\\u0001\\u0002\\u0003\\u0004\\u0005\\u0006\\u0007\\b\\t\\n\\u000b\\f\\r\\u000e\\u000f\\u0010\\u0011\\u0012\\u0013\\u0014\\u0015\\u0016\\u0017\\u0018\\u0019\\u001a\\u001b\\u001c\\u001d\\u001e\\u001f !\\\"#$%&'()*+,-.\\/0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\\u007f\"" } [Received:success] { "\0\1\2\3\4\5\6\7\8\9\ \11\12\13\14\15\16\17\18\19\20\21\22\23\24\25\26\27\28\29\30\31 
!\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~\127" } ==> Test [77] Decode single UTF-16 escape: PASS [Input] { "\"\\uF800\"" } [Received:success] { "" } ==> Test [78] Decode swapped surrogate pair [throw error]: PASS [Input] { "\"\\uDC00\\uD800\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [79] Decode duplicate high surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [80] Decode duplicate low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [81] Decode missing low surrogate [throw error]: PASS [Input] { "\"\\uDB00\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Test [82] Decode invalid low surrogate [throw error]: PASS [Input] { "\"\\uDB00\\uD\"" } [Received:error] { "Expected value but found invalid unicode escape code at character 2" } ==> Set locale to cs_CZ (comma separator) ==> Test [83] Encode number under comma locale: PASS [Input] { 1,5 } [Received:success] { "1.5" } ==> Test [84] Decode number in array under comma locale: PASS [Input] { "[ 10, \"test\" ]" } [Received:success] { { 10,0, "test" } } ==> Revert locale to POSIX ==> Test [85] Set encode_keep_buffer(false): PASS [Input] { false } [Received:success] { false } ==> Test [86] Set encode_number_precision(3): PASS [Input] { 3 } [Received:success] { 3 } ==> Test [87] Encode number with precision 3: PASS [Input] { 0.33333333333333 } [Received:success] { "0.333" } ==> Test [88] Set encode_number_precision(14): PASS [Input] { 14 } [Received:success] { 14 } ==> Test [89] Set encode_keep_buffer(true): PASS [Input] { true } [Received:success] { true } ==> Test [90] Set encode_number_precision(0) [throw error]: PASS [Input] { 0 } [Received:error] { "bad argument #1 to 'cjson.encode_number_precision' (expected integer between 1 and 14)" } ==> Test [91] Set encode_number_precision("five") [throw error]: PASS [Input] { "five" } [Received:error] { "bad argument #1 to 'cjson.encode_number_precision' (number expected, got string)" } ==> Test [92] Set encode_keep_buffer(nil, true) [throw error]: PASS [Input] { nil, true } [Received:error] { "bad argument #2 to 'cjson.encode_keep_buffer' (found too many arguments)" } ==> Test [93] Set encode_max_depth("wrong") [throw error]: PASS [Input] { "wrong" } [Received:error] { "bad argument #1 to 'cjson.encode_max_depth' (number expected, got string)" } ==> Test [94] Set decode_max_depth(0) [throw error]: PASS [Input] { "0" } [Received:error] { "bad argument #1 to 'cjson.decode_max_depth' (expected integer between 1 and 2147483647)" } ==> Test [95] Set encode_invalid_numbers(-2) [throw error]: PASS [Input] { -2 } [Received:error] { "bad argument #1 to 'cjson.encode_invalid_numbers' (invalid option '-2')" } ==> Test [96] Set decode_invalid_numbers(true, false) [throw error]: PASS [Input] { true, false } [Received:error] { "bad argument #2 to 'cjson.decode_invalid_numbers' (found too many arguments)" } ==> Test [97] Set encode_sparse_array("not quite on") [throw error]: PASS [Input] { "not quite on" } [Received:error] { "bad argument #1 to 'cjson.encode_sparse_array' (invalid option 'not quite on')" } ==> Reset Lua CJSON configuration ==> Test [98] Check encode_sparse_array(): PASS [Input] { } [Received:success] { false, 
2, 10 } ==> Test [99] Encode (safe) simple value: PASS [Input] { true } [Received:success] { "true" } ==> Test [100] Encode (safe) argument validation [throw error]: PASS [Input] { "arg1", "arg2" } [Received:error] { "bad argument #1 to 'cjson.safe.encode' (expected 1 argument)" } ==> Test [101] Decode (safe) error generation: PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Test [102] Decode (safe) error generation after new(): PASS [Input] { "Oops" } [Received:success] { nil, "Expected value but found invalid token at character 1" } ==> Summary: all tests succeeded ************************************************** Target test made create-stamp debian/debhelper-build-stamp dh_testroot -O--buildsystem=lua dh_prep -O--buildsystem=lua dh_auto_install -O--buildsystem=lua make --no-print-directory -f /usr/share/dh-lua/make/dh-lua.Makefile.multiple install /build/lua-cjson-2.1.0\+dfsg/debian/tmp Making target install for debian/lua5.1.dh-lua.conf # .lua Installing lua/cjson/util.lua in debian/tmp/usr//share/lua/5.1 # debian/substvars Filling in debian/lua-cjson.substvars Adding new line: lua:Versions=5.1 5.2 5.3 5.4 Filling in debian/lua-cjson-dev.substvars Adding new line: lua:Versions=5.1 5.2 5.3 5.4 Filling in debian/lua-cjson.substvars Adding new line: lua:Provides=lua5.4-cjson, lua5.3-cjson, lua5.2-cjson, lua5.1-cjson, Filling in debian/lua-cjson-dev.substvars Adding new line: lua:Provides=lua5.4-cjson-dev, lua5.3-cjson-dev, lua5.2-cjson-dev, lua5.1-cjson-dev, # .so Installing liblua5.1-cjson libtool: install: install /build/lua-cjson-2.1.0+dfsg/5.1-cjson/.libs/liblua5.1-cjson.so.0.0.0 /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.1-cjson.so.0.0.0 libtool: install: (cd /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu && { ln -s -f liblua5.1-cjson.so.0.0.0 liblua5.1-cjson.so.0 || { rm -f liblua5.1-cjson.so.0 && ln -s liblua5.1-cjson.so.0.0.0 liblua5.1-cjson.so.0; }; }) libtool: install: (cd /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu && { ln -s -f liblua5.1-cjson.so.0.0.0 liblua5.1-cjson.so || { rm -f liblua5.1-cjson.so && ln -s liblua5.1-cjson.so.0.0.0 liblua5.1-cjson.so; }; }) libtool: install: install /build/lua-cjson-2.1.0+dfsg/5.1-cjson/.libs/liblua5.1-cjson.lai /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.1-cjson.la libtool: install: install /build/lua-cjson-2.1.0+dfsg/5.1-cjson/.libs/liblua5.1-cjson.a /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.1-cjson.a libtool: install: chmod 644 /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.1-cjson.a libtool: install: ranlib /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.1-cjson.a libtool: warning: remember to run 'libtool --finish /usr//lib/aarch64-linux-gnu' /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --finish debian/tmp/usr//lib/aarch64-linux-gnu libtool: finish: PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/sbin" ldconfig -n debian/tmp/usr//lib/aarch64-linux-gnu ---------------------------------------------------------------------- Libraries have been installed in: debian/tmp/usr//lib/aarch64-linux-gnu If you ever happen to want to link against installed libraries in a given directory, LIBDIR, you must either use libtool, and specify the full pathname of the library, or use the '-LLIBDIR' flag during linking and do at least one of the following: - add LIBDIR to the 
'LD_LIBRARY_PATH' environment variable during execution - add LIBDIR to the 'LD_RUN_PATH' environment variable during linking - use the '-Wl,-rpath -Wl,LIBDIR' linker flag - have your system administrator add LIBDIR to '/etc/ld.so.conf' See any operating system documentation about shared libraries for more information, such as the ld(1) and ld.so(8) manual pages. ---------------------------------------------------------------------- Creating symlink cjson.so # .pc Installing lua5.1-cjson.pc libtool: install: install -m 0644 /build/lua-cjson-2.1.0+dfsg/5.1-cjson/lua5.1-cjson.pc /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/pkgconfig/lua5.1-cjson.pc # .h Installing /build/lua-cjson-2.1.0+dfsg/5.1-cjson/lua-cjson.h libtool: install: install -m 0644 /build/lua-cjson-2.1.0+dfsg/5.1-cjson/lua-cjson.h /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//include/lua5.1/lua-cjson.h Target install made Making target install for debian/lua5.2.dh-lua.conf # .lua Installing lua/cjson/util.lua in debian/tmp/usr//share/lua/5.2 # debian/substvars Filling in debian/lua-cjson.substvars Skipping already existing line: lua:Versions=5.1 5.2 5.3 5.4 Filling in debian/lua-cjson-dev.substvars Skipping already existing line: lua:Versions=5.1 5.2 5.3 5.4 Filling in debian/lua-cjson.substvars Skipping already existing line: lua:Provides=lua5.4-cjson, lua5.3-cjson, lua5.2-cjson, lua5.1-cjson, Filling in debian/lua-cjson-dev.substvars Skipping already existing line: lua:Provides=lua5.4-cjson-dev, lua5.3-cjson-dev, lua5.2-cjson-dev, lua5.1-cjson-dev, # .so Installing liblua5.2-cjson libtool: install: install /build/lua-cjson-2.1.0+dfsg/5.2-cjson/.libs/liblua5.2-cjson.so.0.0.0 /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.2-cjson.so.0.0.0 libtool: install: (cd /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu && { ln -s -f liblua5.2-cjson.so.0.0.0 liblua5.2-cjson.so.0 || { rm -f liblua5.2-cjson.so.0 && ln -s liblua5.2-cjson.so.0.0.0 liblua5.2-cjson.so.0; }; }) libtool: install: (cd /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu && { ln -s -f liblua5.2-cjson.so.0.0.0 liblua5.2-cjson.so || { rm -f liblua5.2-cjson.so && ln -s liblua5.2-cjson.so.0.0.0 liblua5.2-cjson.so; }; }) libtool: install: install /build/lua-cjson-2.1.0+dfsg/5.2-cjson/.libs/liblua5.2-cjson.lai /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.2-cjson.la libtool: install: install /build/lua-cjson-2.1.0+dfsg/5.2-cjson/.libs/liblua5.2-cjson.a /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.2-cjson.a libtool: install: chmod 644 /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.2-cjson.a libtool: install: ranlib /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.2-cjson.a libtool: warning: remember to run 'libtool --finish /usr//lib/aarch64-linux-gnu' /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --finish debian/tmp/usr//lib/aarch64-linux-gnu libtool: finish: PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/sbin" ldconfig -n debian/tmp/usr//lib/aarch64-linux-gnu ---------------------------------------------------------------------- Libraries have been installed in: debian/tmp/usr//lib/aarch64-linux-gnu If you ever happen to want to link against installed libraries in a given directory, LIBDIR, you must either use libtool, and specify the full pathname of the library, or use the '-LLIBDIR' flag during linking and do at least one of the following: - add LIBDIR to the 
'LD_LIBRARY_PATH' environment variable during execution - add LIBDIR to the 'LD_RUN_PATH' environment variable during linking - use the '-Wl,-rpath -Wl,LIBDIR' linker flag - have your system administrator add LIBDIR to '/etc/ld.so.conf' See any operating system documentation about shared libraries for more information, such as the ld(1) and ld.so(8) manual pages. ---------------------------------------------------------------------- Creating symlink cjson.so # .pc Installing lua5.2-cjson.pc libtool: install: install -m 0644 /build/lua-cjson-2.1.0+dfsg/5.2-cjson/lua5.2-cjson.pc /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/pkgconfig/lua5.2-cjson.pc # .h Installing /build/lua-cjson-2.1.0+dfsg/5.2-cjson/lua-cjson.h libtool: install: install -m 0644 /build/lua-cjson-2.1.0+dfsg/5.2-cjson/lua-cjson.h /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//include/lua5.2/lua-cjson.h Target install made Making target install for debian/lua5.3.dh-lua.conf # .lua Installing lua/cjson/util.lua in debian/tmp/usr//share/lua/5.3 # debian/substvars Filling in debian/lua-cjson.substvars Skipping already existing line: lua:Versions=5.1 5.2 5.3 5.4 Filling in debian/lua-cjson-dev.substvars Skipping already existing line: lua:Versions=5.1 5.2 5.3 5.4 Filling in debian/lua-cjson.substvars Skipping already existing line: lua:Provides=lua5.4-cjson, lua5.3-cjson, lua5.2-cjson, lua5.1-cjson, Filling in debian/lua-cjson-dev.substvars Skipping already existing line: lua:Provides=lua5.4-cjson-dev, lua5.3-cjson-dev, lua5.2-cjson-dev, lua5.1-cjson-dev, # .so Installing liblua5.3-cjson libtool: install: install /build/lua-cjson-2.1.0+dfsg/5.3-cjson/.libs/liblua5.3-cjson.so.0.0.0 /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.3-cjson.so.0.0.0 libtool: install: (cd /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu && { ln -s -f liblua5.3-cjson.so.0.0.0 liblua5.3-cjson.so.0 || { rm -f liblua5.3-cjson.so.0 && ln -s liblua5.3-cjson.so.0.0.0 liblua5.3-cjson.so.0; }; }) libtool: install: (cd /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu && { ln -s -f liblua5.3-cjson.so.0.0.0 liblua5.3-cjson.so || { rm -f liblua5.3-cjson.so && ln -s liblua5.3-cjson.so.0.0.0 liblua5.3-cjson.so; }; }) libtool: install: install /build/lua-cjson-2.1.0+dfsg/5.3-cjson/.libs/liblua5.3-cjson.lai /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.3-cjson.la libtool: install: install /build/lua-cjson-2.1.0+dfsg/5.3-cjson/.libs/liblua5.3-cjson.a /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.3-cjson.a libtool: install: chmod 644 /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.3-cjson.a libtool: install: ranlib /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.3-cjson.a libtool: warning: remember to run 'libtool --finish /usr//lib/aarch64-linux-gnu' /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --finish debian/tmp/usr//lib/aarch64-linux-gnu libtool: finish: PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/sbin" ldconfig -n debian/tmp/usr//lib/aarch64-linux-gnu ---------------------------------------------------------------------- Libraries have been installed in: debian/tmp/usr//lib/aarch64-linux-gnu If you ever happen to want to link against installed libraries in a given directory, LIBDIR, you must either use libtool, and specify the full pathname of the library, or use the '-LLIBDIR' flag during linking and do at least one of the following: - add LIBDIR to the 
'LD_LIBRARY_PATH' environment variable during execution - add LIBDIR to the 'LD_RUN_PATH' environment variable during linking - use the '-Wl,-rpath -Wl,LIBDIR' linker flag - have your system administrator add LIBDIR to '/etc/ld.so.conf' See any operating system documentation about shared libraries for more information, such as the ld(1) and ld.so(8) manual pages. ---------------------------------------------------------------------- Creating symlink cjson.so # .pc Installing lua5.3-cjson.pc libtool: install: install -m 0644 /build/lua-cjson-2.1.0+dfsg/5.3-cjson/lua5.3-cjson.pc /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/pkgconfig/lua5.3-cjson.pc # .h Installing /build/lua-cjson-2.1.0+dfsg/5.3-cjson/lua-cjson.h libtool: install: install -m 0644 /build/lua-cjson-2.1.0+dfsg/5.3-cjson/lua-cjson.h /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//include/lua5.3/lua-cjson.h Target install made Making target install for debian/lua5.4.dh-lua.conf # .lua Installing lua/cjson/util.lua in debian/tmp/usr//share/lua/5.4 # debian/substvars Filling in debian/lua-cjson.substvars Skipping already existing line: lua:Versions=5.1 5.2 5.3 5.4 Filling in debian/lua-cjson-dev.substvars Skipping already existing line: lua:Versions=5.1 5.2 5.3 5.4 Filling in debian/lua-cjson.substvars Skipping already existing line: lua:Provides=lua5.4-cjson, lua5.3-cjson, lua5.2-cjson, lua5.1-cjson, Filling in debian/lua-cjson-dev.substvars Skipping already existing line: lua:Provides=lua5.4-cjson-dev, lua5.3-cjson-dev, lua5.2-cjson-dev, lua5.1-cjson-dev, # .so Installing liblua5.4-cjson libtool: install: install /build/lua-cjson-2.1.0+dfsg/5.4-cjson/.libs/liblua5.4-cjson.so.0.0.0 /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.4-cjson.so.0.0.0 libtool: install: (cd /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu && { ln -s -f liblua5.4-cjson.so.0.0.0 liblua5.4-cjson.so.0 || { rm -f liblua5.4-cjson.so.0 && ln -s liblua5.4-cjson.so.0.0.0 liblua5.4-cjson.so.0; }; }) libtool: install: (cd /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu && { ln -s -f liblua5.4-cjson.so.0.0.0 liblua5.4-cjson.so || { rm -f liblua5.4-cjson.so && ln -s liblua5.4-cjson.so.0.0.0 liblua5.4-cjson.so; }; }) libtool: install: install /build/lua-cjson-2.1.0+dfsg/5.4-cjson/.libs/liblua5.4-cjson.lai /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.4-cjson.la libtool: install: install /build/lua-cjson-2.1.0+dfsg/5.4-cjson/.libs/liblua5.4-cjson.a /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.4-cjson.a libtool: install: chmod 644 /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.4-cjson.a libtool: install: ranlib /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/liblua5.4-cjson.a libtool: warning: remember to run 'libtool --finish /usr//lib/aarch64-linux-gnu' /build/lua-cjson-2.1.0+dfsg/debian/.dh_lua-libtool/libtool --tag=CC --finish debian/tmp/usr//lib/aarch64-linux-gnu libtool: finish: PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/sbin" ldconfig -n debian/tmp/usr//lib/aarch64-linux-gnu ---------------------------------------------------------------------- Libraries have been installed in: debian/tmp/usr//lib/aarch64-linux-gnu If you ever happen to want to link against installed libraries in a given directory, LIBDIR, you must either use libtool, and specify the full pathname of the library, or use the '-LLIBDIR' flag during linking and do at least one of the following: - add LIBDIR to the 
'LD_LIBRARY_PATH' environment variable during execution - add LIBDIR to the 'LD_RUN_PATH' environment variable during linking - use the '-Wl,-rpath -Wl,LIBDIR' linker flag - have your system administrator add LIBDIR to '/etc/ld.so.conf' See any operating system documentation about shared libraries for more information, such as the ld(1) and ld.so(8) manual pages. ---------------------------------------------------------------------- Creating symlink cjson.so # .pc Installing lua5.4-cjson.pc libtool: install: install -m 0644 /build/lua-cjson-2.1.0+dfsg/5.4-cjson/lua5.4-cjson.pc /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//lib/aarch64-linux-gnu/pkgconfig/lua5.4-cjson.pc # .h Installing /build/lua-cjson-2.1.0+dfsg/5.4-cjson/lua-cjson.h libtool: install: install -m 0644 /build/lua-cjson-2.1.0+dfsg/5.4-cjson/lua-cjson.h /build/lua-cjson-2.1.0+dfsg/debian/tmp/usr//include/lua5.4/lua-cjson.h Target install made make[1]: Nothing to be done for '/build/lua-cjson-2.1.0+dfsg/debian/tmp'. dh_install -O--buildsystem=lua dh_lua -O--buildsystem=lua deduplicating cjson/util.lua deduplicating cjson/util.lua deduplicating cjson/util.lua dh_installdocs -O--buildsystem=lua dh_installchangelogs -O--buildsystem=lua dh_installsystemduser -O--buildsystem=lua dh_perl -O--buildsystem=lua dh_link -O--buildsystem=lua dh_strip_nondeterminism -O--buildsystem=lua dh_compress -X.lua -O--buildsystem=lua dh_fixperms -O--buildsystem=lua dh_missing -O--buildsystem=lua dh_dwz -a -O--buildsystem=lua dh_strip -a -O--buildsystem=lua dh_makeshlibs -a -O--buildsystem=lua dh_shlibdeps -a -O--buildsystem=lua dpkg-shlibdeps: warning: symbol lua_pcall used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.1-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_pushinteger used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.1-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_pushboolean used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.1-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_pushvalue used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.1-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_rawseti used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.1-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_next used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.1-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol luaL_error used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.1-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_setmetatable used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.1-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_pushnumber used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.1-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_type used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.1-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: 25 other similar warnings have been skipped (use -v to see them all) dpkg-shlibdeps: warning: symbol lua_toboolean used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.3-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_pushlstring used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.3-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol 
lua_createtable used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.3-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_rawset used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.3-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_typename used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.3-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_tolstring used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.3-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_tonumberx used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.3-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_pushnumber used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.3-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_setmetatable used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.3-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol luaL_error used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.3-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: 25 other similar warnings have been skipped (use -v to see them all) dpkg-shlibdeps: warning: symbol luaL_checklstring used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.2-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_touserdata used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.2-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_pushnil used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.2-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol luaL_checkoption used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.2-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_newuserdata used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.2-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_pushcclosure used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.2-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol luaL_checkinteger used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.2-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_getfield used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.2-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_settop used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.2-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_insert used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.2-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: 25 other similar warnings have been skipped (use -v to see them all) dpkg-shlibdeps: warning: symbol luaL_argerror used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.4-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_pushstring used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.4-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_setfield used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.4-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_gettop used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.4-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: 
symbol lua_pushlightuserdata used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.4-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_rawgeti used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.4-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_checkstack used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.4-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_pcallk used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.4-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol lua_touserdata used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.4-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: symbol luaL_checklstring used by debian/lua-cjson/usr/lib/aarch64-linux-gnu/liblua5.4-cjson.so.0.0.0 found in none of the libraries dpkg-shlibdeps: warning: 25 other similar warnings have been skipped (use -v to see them all) dh_installdeb -O--buildsystem=lua dh_gencontrol -O--buildsystem=lua dh_md5sums -O--buildsystem=lua dh_builddeb -O--buildsystem=lua dpkg-deb: building package 'lua-cjson' in '../lua-cjson_2.1.0+dfsg-2.2_arm64.deb'. dpkg-deb: building package 'lua-cjson-dbgsym' in '../lua-cjson-dbgsym_2.1.0+dfsg-2.2_arm64.deb'. dpkg-deb: building package 'lua-cjson-dev' in '../lua-cjson-dev_2.1.0+dfsg-2.2_arm64.deb'. dpkg-genbuildinfo --build=binary -O../lua-cjson_2.1.0+dfsg-2.2_arm64.buildinfo dpkg-genchanges --build=binary -O../lua-cjson_2.1.0+dfsg-2.2_arm64.changes dpkg-genchanges: info: binary-only upload (no source code included) dpkg-source --after-build . dpkg-buildpackage: info: binary-only upload (no source included) dpkg-genchanges: info: not including original source code in upload I: copying local configuration I: unmounting dev/ptmx filesystem I: unmounting dev/pts filesystem I: unmounting dev/shm filesystem I: unmounting proc filesystem I: unmounting sys filesystem I: cleaning the build env I: removing directory /srv/workspace/pbuilder/20159 and its subdirectories I: Current time: Tue May 14 06:50:22 -12 2024 I: pbuilder-time-stamp: 1715712622 Wed Apr 12 12:27:25 UTC 2023 I: 1st build successful. Starting 2nd build on remote node codethink16-arm64.debian.net. Wed Apr 12 12:27:25 UTC 2023 I: Preparing to do remote build '2' on codethink16-arm64.debian.net. Wed Apr 12 12:31:11 UTC 2023 I: Deleting $TMPDIR on codethink16-arm64.debian.net. Wed Apr 12 12:31:12 UTC 2023 I: lua-cjson_2.1.0+dfsg-2.2_arm64.changes: Format: 1.8 Date: Thu, 01 Dec 2022 00:26:18 +0800 Source: lua-cjson Binary: lua-cjson lua-cjson-dbgsym lua-cjson-dev Architecture: arm64 Version: 2.1.0+dfsg-2.2 Distribution: unstable Urgency: medium Maintainer: The Debian Lua Team Changed-By: Yangfl Description: lua-cjson - JSON parser/encoder for Lua lua-cjson-dev - JSON parser/encoder for Lua, development files Closes: 872599 942569 Changes: lua-cjson (2.1.0+dfsg-2.2) unstable; urgency=medium . * Non-maintainer upload. * Add lua version 5.3 (Closes: #872599, #942569). * Add lua version 5.4. * Add Rules-Requires-Root: no. * Add hardening options. * Bump debhelper compat to 13. * Bump Standards-Version to 4.6.1. . [ Boyuan Yang ] * debian/control: Migrate Vcs-* fields to Salsa lua-team. * debian/copyright: Use latest machine-readable copyright format. 
Checksums-Sha1: 21ae501bd4f3a4470663a7e3b0ea35c8c8ea8404 105924 lua-cjson-dbgsym_2.1.0+dfsg-2.2_arm64.deb 0d1cb6218756faf946070200e624baa3830ba219 32516 lua-cjson-dev_2.1.0+dfsg-2.2_arm64.deb 8616ded47498f0bf997c5f2ac5ec836f9f500e00 6086 lua-cjson_2.1.0+dfsg-2.2_arm64.buildinfo a60dd6eb9bb5e6f80169e2ef3707967d054ea29a 19104 lua-cjson_2.1.0+dfsg-2.2_arm64.deb Checksums-Sha256: 7550a72e5c49fedbf0940c3bba0e61c6f3764a3e318f3022303f6befe9b7bffc 105924 lua-cjson-dbgsym_2.1.0+dfsg-2.2_arm64.deb 6d76fa37b6d06e06c1010d63abcb0d1f1feef1254ee041e26e913ede37ed25db 32516 lua-cjson-dev_2.1.0+dfsg-2.2_arm64.deb 3487aec6219e14d26cb159b6831395f969a4d60683aeeba8c5efa4b60721f3cf 6086 lua-cjson_2.1.0+dfsg-2.2_arm64.buildinfo 0e12490a2de67ecbd034de73eb9722c01c29edf5f745e1240e62e7c85be4244a 19104 lua-cjson_2.1.0+dfsg-2.2_arm64.deb Files: 0089e68bc39e6c20ea2136b36059fe15 105924 debug optional lua-cjson-dbgsym_2.1.0+dfsg-2.2_arm64.deb a3766ad1e78936ca68854acc6a3db2f7 32516 libdevel optional lua-cjson-dev_2.1.0+dfsg-2.2_arm64.deb 08ce93b1296011ea0cb71598b743ae72 6086 interpreters optional lua-cjson_2.1.0+dfsg-2.2_arm64.buildinfo eb148fe32e2bf1cece104acf4038264b 19104 interpreters optional lua-cjson_2.1.0+dfsg-2.2_arm64.deb Wed Apr 12 12:31:13 UTC 2023 I: diffoscope 240 will be used to compare the two builds: # Profiling output for: /usr/bin/diffoscope --timeout 7200 --html /srv/reproducible-results/rbuild-debian/r-b-build.kKq3SN5t/lua-cjson_2.1.0+dfsg-2.2.diffoscope.html --text /srv/reproducible-results/rbuild-debian/r-b-build.kKq3SN5t/lua-cjson_2.1.0+dfsg-2.2.diffoscope.txt --json /srv/reproducible-results/rbuild-debian/r-b-build.kKq3SN5t/lua-cjson_2.1.0+dfsg-2.2.diffoscope.json --profile=- /srv/reproducible-results/rbuild-debian/r-b-build.kKq3SN5t/b1/lua-cjson_2.1.0+dfsg-2.2_arm64.changes /srv/reproducible-results/rbuild-debian/r-b-build.kKq3SN5t/b2/lua-cjson_2.1.0+dfsg-2.2_arm64.changes ## command (total time: 0.000s) 0.000s 1 call cmp (internal) ## has_same_content_as (total time: 0.000s) 0.000s 1 call abc.DotChangesFile ## main (total time: 0.377s) 0.377s 2 calls outputs 0.000s 1 call cleanup ## recognizes (total time: 0.020s) 0.020s 12 calls diffoscope.comparators.binary.FilesystemFile 0.000s 10 calls abc.DotChangesFile ## specialize (total time: 0.000s) 0.000s 1 call specialize Wed Apr 12 12:31:16 UTC 2023 I: diffoscope 240 found no differences in the changes files, and a .buildinfo file also exists. Wed Apr 12 12:31:16 UTC 2023 I: lua-cjson from bookworm built successfully and reproducibly on arm64. Wed Apr 12 12:31:17 UTC 2023 I: Submitting .buildinfo files to external archives: Wed Apr 12 12:31:17 UTC 2023 I: Submitting 8.0K b1/lua-cjson_2.1.0+dfsg-2.2_arm64.buildinfo.asc Wed Apr 12 12:31:19 UTC 2023 I: Submitting 8.0K b2/lua-cjson_2.1.0+dfsg-2.2_arm64.buildinfo.asc Wed Apr 12 12:31:20 UTC 2023 I: Done submitting .buildinfo files to http://buildinfo.debian.net/api/submit. Wed Apr 12 12:31:20 UTC 2023 I: Done submitting .buildinfo files. Wed Apr 12 12:31:20 UTC 2023 I: Removing signed lua-cjson_2.1.0+dfsg-2.2_arm64.buildinfo.asc files: removed './b1/lua-cjson_2.1.0+dfsg-2.2_arm64.buildinfo.asc' removed './b2/lua-cjson_2.1.0+dfsg-2.2_arm64.buildinfo.asc'
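
Editor's note (not part of the captured build log): the test-suite output reproduced above exercises the public lua-cjson 2.1.0 runtime API — encode/decode, the encode_*/decode_* configuration setters, and the cjson.safe wrapper. As an illustrative sketch only, the Lua calls corresponding to those test descriptions look roughly like the following; the function names and the values quoted in the comments are taken from the test entries in the log, not verified independently here.

-- Illustrative sketch, assuming lua-cjson 2.1.0 as built above.
local cjson = require "cjson"

-- Basic round-trip (cf. tests [2], [41], [42]).
print(cjson.encode({ 10, "hello" }))            -- [10,"hello"]
local t = cjson.decode('{ "1": "one", "3": "three" }')
print(t["1"], t["3"])                           -- one	three

-- Runtime configuration exercised by tests [22]-[36] and [44]-[97].
cjson.decode_max_depth(5)                       -- decoding deeper than 5 levels now raises an error (test [24])
cjson.encode_number_precision(3)                -- cjson.encode(0.33333333333333) == "0.333" (test [87])
cjson.encode_sparse_array(true, 2, 3)           -- { [3] = "x" } encodes as [null,null,"x"] (test [70])
cjson.encode_invalid_numbers("null")            -- NaN/Inf encode as "null" instead of raising (tests [63]-[64])

-- The cjson.safe variant returns nil plus an error message instead of raising
-- (cf. tests [101]-[102]).
local cjson_safe = require "cjson.safe"
local value, err = cjson_safe.decode("Oops")
print(value, err)                               -- nil	Expected value but found invalid token at character 1

Test [102] above repeats the safe-mode check "after new()"; cjson.new() returns an independent copy of the module, so configuration changes made on one instance do not leak into another.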