2026-02-02 17:38:49 it's because of https://gitlab.alpinelinux.org/alpine/aports/-/merge_requests/96843#note_580462
2026-02-02 18:40:45 hmm
2026-02-02 18:40:50 what hmm
2026-02-02 18:41:12 at the qtwebengine
2026-02-02 18:41:18 oh
2026-02-02 18:41:23 didn't see your message just now
2026-02-02 18:41:35 yeah, waiting for https://gitlab.alpinelinux.org/alpine/aports/-/jobs/2202579
2026-02-02 18:41:40 see https://gitlab.alpinelinux.org/alpine/aports/-/merge_requests/96892
2026-02-02 19:07:15 algitbot: retry master
2026-02-02 19:07:20 Damn it riscv64...
2026-02-02 19:07:51 my most loved architecture
2026-02-03 01:45:22 =/
2026-02-03 01:50:42 arti did not fail like that on arm* in CI
2026-02-03 02:01:07 fewer tests failed this time...
2026-02-03 02:01:24 (6 instead of 32)
2026-02-03 06:17:10 sorry about ^^
2026-02-03 06:19:49 thanks for fixing :)
2026-02-03 11:36:54 algitbot: retry master
2026-02-04 16:14:19 Fun times are to be had
2026-02-04 16:15:27 we wouldn't do this if it wasn't fun, would we ;)
2026-02-04 16:16:49 achill: ftr, this is not a comment on your work
2026-02-04 16:26:07 Get "https://humungus.tedunangst.com/r/webs?go-get=1": dial tcp 23.227.131.12:443: i/o timeout
2026-02-04 16:39:35 algitbot: retry master
2026-02-04 16:39:54 yeah, i really love the go ecosystem
2026-02-04 16:42:47 git-lfs: t-pull.sh failed with a dump of logs to stdout
2026-02-04 16:54:12 grafana: FAIL: TestAlertRuleRetry/first_attempt (0.01s)
2026-02-04 16:54:17 Condition never satisfied
2026-02-04 17:09:39 I thought I fixed etcd :(
2026-02-04 17:10:50 :( meow
2026-02-04 18:13:12 hmmm, etcd is even downloading the toolchain :(
2026-02-04 18:13:51 go: downloading go1.23.11 (linux/amd64)
2026-02-04 18:23:08 eureka
2026-02-04 18:23:11 FORCE_HOST_GO=1
2026-02-04 18:34:18 And ftr, chmod-clean only affects srcdir, not tmpdir, nor pkgdir
2026-02-04 18:42:35 algitbot: retry master
2026-02-04 19:09:58 ^drunken test
2026-02-04 19:10:26 "[..] /users/followers/reject/a.localdomain/user/bob 🔴 Reject[..]" "does not contain "🔴 Reject" line
2026-02-04 19:10:28 "
2026-02-04 19:23:40 ^ racy test
2026-02-05 01:52:24 edge-aarch64 zot tests failed with EMFILE: .{"level":"panic","error":"too many open files","goroutine":131,"caller":"zotregistry.dev/zot/pkg/api/controller.go:110","time":"2026-02-05T01:49:44.086727985Z","message":"failed to create htpasswd watcher"}
2026-02-05 06:07:41 algitbot: retry master
2026-02-05 09:50:18 algitbot: retry master
2026-02-05 11:36:39 headscale seems racy
2026-02-05 11:37:27 not surprised about that tbh (having run headscale in production with... somewhat mixed results)
2026-02-05 11:57:11 algitbot: retry 3.23-stable
2026-02-05 12:19:13 =/
2026-02-05 12:20:55 rerunning the aarch64 CI pipeline for nuclei
2026-02-05 12:43:44 algitbot: retry master
2026-02-05 12:54:55 I'm checking nuclei on aarch64
2026-02-05 13:02:01 algitbot: retry 3.23-stable
2026-02-05 13:21:20 ikke: thanks
2026-02-05 13:27:25 ¯\_(ツ)_/¯
2026-02-05 13:38:10 betula needs to be upgraded
2026-02-05 16:15:46 algitbot: retry master
2026-02-05 16:24:59 algitbot: retry 3.23-stable
2026-02-05 16:37:32 algitbot: retry 3.23-stable
2026-02-05 17:26:31 algitbot: retry 3.23-stable
2026-02-07 08:03:28 achill: you bumped the pkgver instead of pkgrel
2026-02-07 08:03:41 _kver instead of _krel
2026-02-07 08:22:02 ah yeah
2026-02-07 08:22:10 i hate rebases not knowing pkgrels
2026-02-07 12:23:02 algitbot: retry master
2026-02-07 21:03:53 algitbot: retry master
2026-02-08 05:14:56 algitbot: retry master
2026-02-08 10:19:49 [ FAILED ] TestUserMetricsService.PersistsDataSourcesBetweenRestart (14433 ms)
2026-02-08 10:44:32 !97189
2026-02-08 10:58:21 =/
2026-02-08 11:06:39 !97190
2026-02-08 15:44:34 \o/
2026-02-09 18:33:14 algitbot: retry 3.23-stable
2026-02-09 20:52:18 algitbot: retry master
2026-02-10 01:06:55 algitbot: retry 3.22-stable
2026-02-10 12:43:16 cogitri \o/
2026-02-10 12:43:50 o/
2026-02-10 15:12:54 algitbot: retry 3.23-stable
2026-02-11 02:10:42 !97305
2026-02-11 02:11:09 and testing/electron will need to be rebuilt
2026-02-11 03:27:08 😬
2026-02-11 06:48:07 chromium now fails to build..
2026-02-11 06:48:58 pipelines never finished
2026-02-11 11:41:44 algitbot: retry master
2026-02-11 16:44:14 php failures are random, already reported upstream
2026-02-11 16:44:31 if it keeps failing, better to disable the test
2026-02-11 16:47:51 algitbot: retry master
2026-02-11 18:35:20 algitbot: retry master
2026-02-11 18:50:32 algitbot: retry master
2026-02-11 22:11:47 algitbot: retry master
2026-02-12 07:37:30 asymptote maintainer notified
2026-02-12 15:06:21 what's with asymptote?
2026-02-12 15:12:34 possible fix in !97405
2026-02-12 15:17:47 omni: stuck
2026-02-12 15:18:25 in conversation with the maintainer to get it unstuck, pending fix in the MR
2026-02-12 15:19:19 suggested they temporarily add py3-cson to show passing CI, then remove it again, since it hasn't been uploaded yet on the stuck arches
2026-02-12 15:20:07 as a basic check, not to test the unstuck fix
2026-02-12 15:21:09 (or not directly test, since it doesn't hang in CI)
2026-02-12 15:21:36 algitbot: retry master
2026-02-12 15:22:44 was going to wait until the MR is merged to request a release on the builders, so they can retry with the fix
2026-02-12 15:22:59 I can do it again after merge
2026-02-12 15:23:02 if necessary
2026-02-12 15:23:18 They're now building other packages
2026-02-12 15:23:39 okay, thanks!
2026-02-12 15:32:32 armv7 may need another nudge later if it doesn't fail to build docs
2026-02-12 15:34:07 algitbot: retry master
2026-02-12 15:44:15 should clear up shortly
2026-02-12 16:59:03 algitbot: retry master
2026-02-12 17:54:15 FAIL: test-pthread-rwlock
2026-02-12 17:54:31 build-edge-riscv64: failed to build gnutls: ^^
2026-02-12 17:54:33 gnulib test?
2026-02-12 17:54:54 fyi, it hangs, and then I kill it
2026-02-12 17:55:45 https://git.alpinelinux.org/aports/commit/?id=75d3777e6a0e17aa6789dc7aa54cfc59ed9c1924
2026-02-12 17:56:53 https://lists.gnu.org/archive/html/bug-gnulib/2025-05/msg00174.html
2026-02-12 17:57:26 also seen previously in gettext
2026-02-12 21:01:37 algitbot: retry master
2026-02-13 10:13:56 algitbot: retry master
2026-02-14 07:23:50 algitbot: retry master
2026-02-14 08:03:42 ..
2026-02-15 14:13:00 achill: ^^
2026-02-15 14:13:10 I suspect a lot of packages will fail to build now
2026-02-15 14:13:11 ah yes
2026-02-15 14:13:23 typical breaking change in python libraries, i guess
2026-02-15 14:13:53 maybe i should check all packages depending on setuptools and see if i can fix them
2026-02-15 14:14:03 That's going to be a lot of work
2026-02-15 14:14:16 gentoo is also struggling with it
2026-02-15 14:14:20 oh
2026-02-15 14:15:45 https://lore.kernel.org/distributions/f067f68c88630dd1cb6180d2773fe4b6588d2013.camel@gentoo.org/T/#u
2026-02-15 14:16:18 ah thanks, i thought i was subscribed to that list
2026-02-15 14:17:52 > libsass-python: This repository was archived by the owner on Oct 24, 2025. It is now read-only.
2026-02-15 14:17:53 awesome
2026-02-15 14:19:03 only mint-themes seems to depend on it
2026-02-15 14:19:40 there is also a patch in an issue which i'm gonna test now
2026-02-15 14:20:43 achill: there is an issue assigned to you with another package that misses pkg_resources
2026-02-15 14:21:15 ah yeah, gnome-secrets is just at an old version; the latest version probably fixes it but needs some patch rebasing
2026-02-15 14:21:52 But unless we do something, we will run into quite some issues at release time
2026-02-15 14:22:08 This is quite a breaking change
2026-02-15 14:23:21 hmm yea
2026-02-15 14:23:49 if we decide to distribute pkg_resources as a separate package we can ship it, but in the meantime we should at least try to fix up the packages
2026-02-15 14:24:21 if we decide at 3.24-builder-start that it's still too much, we can probably still downgrade setuptools again
2026-02-15 15:25:33 algitbot: retry master
2026-02-16 01:01:22 algitbot: kick build-edge-x86_^4
2026-02-16 01:01:23 s/x86_^4/x86_64/
2026-02-16 01:01:27 algitbot: kick build-edge-x86_64
2026-02-16 01:01:52 algitbot: kick edge-armv7
2026-02-16 11:48:39 achill: sadly, just adding gpgme-dev doesn't seem to help
2026-02-17 12:13:14 oh come on
2026-02-17 19:41:01 KDE's mirrors are awful atm...
2026-02-17 19:42:36 Do you have the source files?
2026-02-17 20:29:23 ikke: not personally, but the CDN will most of the time give a good mirror, just sometimes a bad one. Retrying enough eventually gets through
2026-02-17 20:39:26 algitbot: retry master
2026-02-17 21:40:50 algitbot: retry master
2026-02-17 23:59:03 more than a few 404s above
2026-02-17 23:59:07 algitbot: retry master
2026-02-17 23:59:55 how did this happen? were they there at one point?
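[editor's note] The setuptools discussion above is about upstream dropping the bundled pkg_resources module, which breaks any package that still imports it. As a hedged illustration only (not the fix applied in aports), here is a minimal sketch of the usual migration to the stdlib importlib.metadata API; the helper name pkg_version is hypothetical:

```python
# Sketch of the common pkg_resources -> importlib.metadata migration.
# The helper name pkg_version() is hypothetical, for illustration only.
#
# Old style, broken once setuptools stops shipping pkg_resources:
#   import pkg_resources
#   ver = pkg_resources.get_distribution("some-dist").version

from importlib.metadata import PackageNotFoundError, version


def pkg_version(dist_name: str) -> str:
    """Return the installed version of a distribution, or 'unknown'."""
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return "unknown"


# A distribution that is not installed reports "unknown" instead of
# raising, unlike a bare pkg_resources lookup.
print(pkg_version("definitely-not-installed-dist"))
```

importlib.metadata has been in the standard library since Python 3.8, so this removes the runtime dependency on setuptools entirely.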
2026-02-17 23:59:57 algitbot: retry master
2026-02-18 00:26:55 algitbot: retry master
2026-02-18 00:27:15 algitbot: retry master
2026-02-18 00:27:26 algitbot: retry master
2026-02-18 00:36:18 algitbot: retry master
2026-02-18 00:36:54 algitbot: retry master
2026-02-18 00:37:18 algitbot: retry master
2026-02-18 00:37:42 algitbot: retry master
2026-02-18 00:45:11 algitbot: retry master
2026-02-18 00:45:30 algitbot: retry master
2026-02-18 00:50:31 algitbot: retry master
2026-02-18 00:51:00 algitbot: retry master
2026-02-18 00:52:37 algitbot: retry master
2026-02-18 00:54:56 algitbot: retry master
2026-02-18 00:55:20 algitbot: retry master
2026-02-18 00:55:29 algitbot: retry master
2026-02-18 00:55:56 algitbot: retry master
2026-02-18 00:56:42 algitbot: retry master
2026-02-18 16:37:40 algitbot: retry master
2026-02-18 18:55:44 expected, but I needed to know
2026-02-18 18:57:26 FAIL src/test/test_sandbox.c:286: stat: Operation not permitted [1]
2026-02-18 18:57:28 this one?
2026-02-18 18:57:32 yes
2026-02-18 18:57:47 https://gitlab.alpinelinux.org/alpine/aports/-/merge_requests/97705#note_584889
2026-02-18 18:58:33 ikke: is that something you'd like to look at before I update to skip it?
2026-02-18 18:58:53 Do you have any clue what to look for?
2026-02-18 18:59:10 and/or why it's failing?
2026-02-18 18:59:19 nope, sorry
2026-02-18 18:59:40 Doesn't the test failure imply that something is broken?
2026-02-18 19:00:05 it doesn't fail on later stable releases or edge
2026-02-18 19:04:31 I was also previously patching that test out; now I'm skipping (fewer) tests in another way
2026-02-18 19:08:20 ikke: upstream says they don't have access to that architecture to properly debug the sandboxing parts
2026-02-18 19:11:22 Wouldn't it be better to disable the package then? Sandboxing sounds like an important part of its security aspect. Or is that not the case?
2026-02-18 19:38:32 ikke: I hear that this sandboxing isn't used much, and sandboxing tor is usually done by other means
2026-02-18 19:41:53 omni: ok, then I guess it does not matter too much to disable this test
2026-02-18 19:42:59 ikke: I'm not too worried, since it does not imply that tor would be less sandboxed when the test is failing like this, but rather the opposite
2026-02-18 19:43:27 still curious about what is different on loongarch64 between 3.21 and 3.22
2026-02-18 19:43:44 omni: you mean because it's blocking more than it should, rather than letting things through
2026-02-18 19:45:06 yes
2026-02-18 19:45:44 ack
2026-02-18 19:46:52 perhaps it's https://github.com/seccomp/libseccomp/releases/tag/v2.6.0
2026-02-18 19:47:26 in 3.21 libseccomp is at 2.5.5 with a loongarch64.patch
2026-02-19 14:05:30 plasma-desktop@riscv64:
2026-02-19 14:05:31 11/11 Test #4: foldermodeltest .....................***Failed 16.00 sec
2026-02-19 14:06:01 (in case it will fail on something else next time)
2026-02-19 15:20:31 thread 'tests::echo::echo_dict' (803) panicked at test-output/src/tests/echo.rs:64:6:
2026-02-19 15:44:33 gleam@riscv64 failed on:
2026-02-19 15:44:34 ---- tests::echo::echo_dict stdout ----
2026-02-20 14:02:21 bleh
2026-02-20 14:03:20 bleh
2026-02-21 13:39:54 algitbot: retry 3.21-stable
2026-02-21 16:24:38 algitbot: retry 3.20-stable
2026-02-23 16:26:24 algitbot: retry master
2026-02-25 12:21:21 algitbot: retry master
2026-02-25 14:21:23 algitbot: retry master
2026-02-25 19:11:25 algitbot: retry master
2026-02-27 08:58:35 algitbot: retry master
2026-02-27 16:42:41 wow, riscv64 is up to date \o/