2025-10-01 05:36:25 oops
2025-10-01 06:33:59 algitbot: retry 3.20-stable
2025-10-01 07:22:25 build-3-21-riscv64 and build-3-22-riscv64 look stuck
2025-10-01 08:08:58 algitbot: retry master
2025-10-01 15:13:55 algitbot: retry 3.22-stable
2025-10-01 15:16:40 algitbot: retry 3.22-stable
2025-10-01 15:18:34 algitbot: retry 3.22-stable
2025-10-01 15:52:39 algitbot: retry 3.17-stable
2025-10-01 15:57:10 algitbot: retry 3.18-stable
2025-10-01 16:11:43 algitbot: retry master
2025-10-01 19:55:54 algitbot: retry 3.21-stable
2025-10-02 05:14:37 mio: !90925
2025-10-02 05:17:58 ikke: okay. was looking at it for a while, only added since the other was still in draft
2025-10-02 05:42:24 mio: you could try to ask about that on #-loongarch
2025-10-02 05:48:08 ikke: good idea, thanks
2025-10-02 06:43:36 a temporary measure, hopefully gives the maintainer a bit of time to work on the upgrade
2025-10-02 10:28:46 algitbot: retry 3.22-stable
2025-10-02 11:28:22 algitbot: retry amster
2025-10-02 12:11:06 algitbot: retry master
2025-10-02 14:29:16 algitbot: retry master
2025-10-02 15:00:18 ancient bug report: https://bugs.launchpad.net/compiz/+bug/1021139
2025-10-04 05:38:19 fossdd: fyi, it's not a good idea to merge chromium for 2 branches at the same time
2025-10-04 05:38:47 it will mean the aarch64 host will try to build 4 of them at the same time
2025-10-04 11:22:47 algitbot: retry master
2025-10-05 09:08:28 what's with aarch64? (both edge and 3.22)
2025-10-05 10:33:16 omni: the builder has network issues
2025-10-05 10:41:47 ok
2025-10-05 11:11:35 something is missing with qt6-qtwebengine 6.9.3-r0, pages render blank in qutebrowser and falkon but they seem to load and you can see links when you hover over them
2025-10-05 11:12:47 downgrading qt6-qtwebengine to 6.9.2-r2 works, while keeping all the other qt6 packages at 6.9.3
2025-10-06 08:34:16 algitbot: retry 3.22-stable
2025-10-06 10:47:28 algitbot: retry master
2025-10-07 06:49:15 =(
2025-10-07 06:49:26 algitbot: retry 3.19-stable
2025-10-07 06:49:50 ah, fetch failed
2025-10-07 06:57:30 algitbot: retry 3.19-stable
2025-10-07 10:14:39 algitbot: retry master
2025-10-07 10:14:43 algitbot: retry 3.22-stable
2025-10-07 16:17:24 algitbot: retry master
2025-10-07 16:17:49 why do they still have py3-astroid 4, hmmm
2025-10-07 16:19:16 pkgname=py3-astroid
2025-10-07 16:19:18 pkgver=3.3.11
2025-10-07 16:19:55 yeah but the builder still has the old one:
2025-10-07 16:19:55 (17/84) Installing py3-astroid (4.0.0-r0)
2025-10-07 16:20:35 fossdd_: you downgraded
2025-10-07 16:20:37 not upgraded
2025-10-07 16:20:46 4.0.0 -> 3.3.11
2025-10-07 16:21:21 yeah this was intentional since they are not compatible with each other, seems like i messed up the commit message
2025-10-07 16:21:35 well, then you need to explicitly pull in ~3
2025-10-07 16:22:10 otherwise apk will favor the highest version, which will only be removed after all builds are completed
2025-10-07 16:24:44 ahh right, because the mirrors already have 4 and it's not just locally
2025-10-07 19:38:09 fossdd_: no, the builders do not really care what's on the mirrors
2025-10-07 19:38:14 s/really/
2025-10-07 19:38:48 When you upgrade (or downgrade) a package, the new package is appended to the index; only at the end is the old (or new) version removed
2025-10-07 19:39:04 So as long as a newer version is available, apk will use that newer version
2025-10-07 22:54:56 algitbot: retry 3.19-stable
2025-10-07 22:56:35 algitbot: retry 3.22-stable
2025-10-07 22:59:14 algitbot: retry 3.19-stable
2025-10-07 23:04:04 !91189
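The "pull in ~3" advice above refers to apk's fuzzy version constraint. A minimal sketch of what that could look like in the APKBUILD of a package that needs the old py3-astroid series (the exact constraint line is illustrative; the package names and versions come from the discussion above):

    # depend on the 3.x series explicitly, so apk ignores the 4.0.0 that is
    # still present in the builder's index until the end of the build run
    depends="py3-astroid~3"
    # a tighter pin such as py3-astroid~3.3 would also work

With the fuzzy constraint in place, apk resolves the dependency to 3.3.11 even while 4.0.0 is still indexed.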
2025-10-07 23:11:36 the failing test was disabled before the upgrade, but is now enabled on more architectures than before
2025-10-07 23:17:22 not sure what is going on with arti on build-3.22-armhf, perhaps wait till edge builds are done on a*?
2025-10-08 05:02:59 bionic_dl_iterate_phdr is behind an #ifdef
2025-10-08 05:03:19 https://tpaste.us/OvW8
2025-10-08 20:02:42 hmm, cert is valid until Oct 5th
2025-10-08 20:08:46 Ok, now I understand
2025-10-08 20:08:57 The test should have failed because the cert was valid way too long
2025-10-08 20:09:09 but because it expired now, it no longer fails
2025-10-08 21:58:46 algitbot: retry master
2025-10-08 22:03:56 algitbot: retry master
2025-10-08 22:07:29 algitbot: retry master
2025-10-08 22:58:39 algitbot: retry master
2025-10-08 23:02:51 algitbot: retry master
2025-10-08 23:14:09 algitbot: retry master
2025-10-08 23:19:41 algitbot: retry master
2025-10-08 23:45:58 algitbot: retry master
2025-10-09 00:21:07 algitbot: retry master
2025-10-09 01:49:57 algitbot: retry master
2025-10-09 01:57:21 algitbot: retry master
2025-10-09 12:53:52 algitbot: retry master
2025-10-09 13:38:39 mio: should we create an issue for that failing test?
2025-10-09 13:40:19 ikke: can do if you think it'll be helpful. maybe for the maintainer to decide on any further course of action
2025-10-09 13:40:38 I was wondering why this is only failing on aarch64
2025-10-09 13:41:05 does it use more threads?
2025-10-09 13:41:25 The builder does have a lot of cores, so possibly
2025-10-09 13:41:27 +
2025-10-09 13:41:34 CI has less
2025-10-09 13:42:02 the error message seems to indicate reducing thread count might help
2025-10-09 13:42:54 Indeed
2025-10-09 13:50:33 #17599
2025-10-09 13:51:44 Thanks
2025-10-09 13:52:11 you're welcome, thanks for your input
2025-10-09 15:29:51 mio: ^
2025-10-09 15:30:06 it was before the fix
2025-10-09 15:30:08 right
2025-10-09 15:30:11 s/fix/skip/
2025-10-09 15:33:59 considered patching the test to set threads, maybe andypost's suggestion would be better
2025-10-09 15:34:25 Yeah, but we have to be careful not to negate the goal of the test
2025-10-09 15:34:52 yeah
2025-10-10 16:47:48 ikke: any thoughts about distrobuilder's failed test? seems to be having trouble accessing a host
2025-10-10 16:48:11 Failed to run: gpg --homedir /tmp/--/gpg.2788625260 --keyserver keyserver.ubuntu.com --recv-keys [...] 0x5DE8949A899C8D99: gpg: keyserver receive failed: Host is unreachable
2025-10-10 16:48:12 checking..
2025-10-10 16:48:19 only happens on riscv64
2025-10-10 16:48:40 thanks
2025-10-10 16:53:22 I think it's because it's getting an IPv6 address back while that server does not have ipv6 connectivity
2025-10-10 16:53:31 getent hosts keyserver.ubuntu.com
2025-10-10 16:53:33 2620:2d:4000:1007::70c keyserver.ubuntu.com keyserver.ubuntu.com
2025-10-10 16:55:48 what do you recommend? not sure it's going to be able to get ipv4 with retries
2025-10-10 16:56:26 I'm curious why it resolves to an ipv6 address in the first place
2025-10-10 16:56:27 or how long before it does
2025-10-10 16:57:46 seems to default to ipv6 when I tried to ping it
2025-10-10 17:03:45 nvm about the ping, may just be my provider
2025-10-10 17:04:53 It's the resolver (getaddrinfo, musl) that decides what to return
2025-10-10 17:07:32 ah
2025-10-10 17:55:45 mio: heh, I cannot reach the host on ipv4 either..
2025-10-10 17:56:14 not with ping
2025-10-10 18:02:15 is the host reachable from a different ip range?
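To see why the resolver hands back an IPv6 address, it can help to compare the raw DNS answers with what getaddrinfo returns; a small diagnostic sketch, assuming dig from bind-tools is available on the builder:

    # the full set of answers the DNS serves
    dig +short keyserver.ubuntu.com A
    dig +short keyserver.ubuntu.com AAAA
    # what musl's getaddrinfo actually hands to applications (first answer wins in the log above)
    getent hosts keyserver.ubuntu.com

If only the AAAA answer gets used and the network has no working IPv6 route, the "Host is unreachable" error above is expected.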
2025-10-10 18:03:06 mio: I cannot ping it from other servers either, so I suspect they are blocking ICMP :/
2025-10-10 18:07:01 yeah, same. the test didn't seem to have any problem on e.g. x86_64, just wondered if they were blocking certain ranges
2025-10-10 18:07:18 accidentally
2025-10-10 18:08:19 First need to verify if it's trying to connect via ipv6. If that's the case, it's not going to work
2025-10-10 18:08:37 Then I have to find out why it's requesting AAAA records
2025-10-10 18:09:56 just tried to build it locally, no issue, though I couldn't ping either ipv4/6
2025-10-10 18:10:26 so looks like you're right and it's just blocking icmp
2025-10-10 18:10:27 mio: what does `getent hosts keyserver.ubuntu.com` return?
2025-10-10 18:10:44 2620:2d:4000:1007::d43 keyserver.ubuntu.com keyserver.ubuntu.com
2025-10-10 18:11:00 Does your system have ipv6?
2025-10-10 18:11:03 yes
2025-10-10 18:11:06 right
2025-10-10 18:13:14 mio: hmm
2025-10-10 18:13:32 If I run the gpg command on the builder now, it works
2025-10-10 18:15:36 it didn't work before?
2025-10-10 18:15:41 I don't know
2025-10-10 18:15:45 first time I tried it now
2025-10-10 18:15:57 But that's the command you said failed
2025-10-10 18:17:03 yeah
2025-10-10 18:17:42 https://build.alpinelinux.org/buildlogs//build-edge-riscv64/community/distrobuilder/distrobuilder-3.2-r0.log in context
2025-10-10 18:17:52 same on 3.1-r9
2025-10-10 18:18:54 ftr, I do see that on the host the builders are on, the containers receive an address in a private (not link-local) range; there is something sending router advertisements
2025-10-10 18:19:06 but for some reason that does not affect the other build host in the same network
2025-10-10 18:19:33 in any case, the builder is now chugging on apache-arrow
2025-10-10 18:21:20 fwiw, the test would fail back in 1.24.x, and it seemed to later resolve itself
2025-10-10 18:22:08 on balance of probability, if okay to wait, maybe it will as well this time
2025-10-10 18:24:08 just not sure when or why
2025-10-10 18:29:35 okay. just ping if further action is advisable on my part
2025-10-10 18:30:01 ikke: ^ in case it's too long a wait
2025-10-10 18:55:11 now getting host is unreachable..
2025-10-10 18:58:07 huh
2025-10-10 18:58:45 so there is a gpg-agent process that keeps running
2025-10-10 18:59:00 the gpg command fails until after I kill that agent
2025-10-10 19:05:35 But in the tests it always fails
2025-10-10 19:38:25 interesting
2025-10-10 19:43:22 I have no energy to pursue this further atm
2025-10-10 19:44:07 okay, thanks for checking
2025-10-10 19:44:39 no luck so far reproducing the error
2025-10-11 03:54:12 algitbot: retry master
2025-10-11 03:59:00 algitbot: retry master
2025-10-11 04:37:59 ikke: initially tried to direct the tests to another keyserver, but it didn't seem to help. conditionally skipped the tests to enable the builder to continue building in testing/. please feel free to revert the patch if you find a good fix
2025-10-11 04:39:04 for now it will try an initial fetch in each case and skip the rest of the test if the connection fails
2025-10-11 16:51:29 is build-edge-armv7 stuck?
2025-10-11 17:01:11 ok, I guess not
2025-10-11 17:46:08 kind of expected
2025-10-11 17:52:26 I'll fix it
2025-10-11 21:47:36 algitbot: retry master
2025-10-11 22:14:45 algitbot: retry master
2025-10-12 00:00:50 has something changed in the past 4h to s390x?
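The conditional skip described above (try one fetch, skip the rest if it fails) could look roughly like this in the aport's check(); the keyserver and key id are copied from the error earlier in the log, and the final test command is only a placeholder, not distrobuilder's real invocation:

    check() {
        # probe the keyserver once; if the fetch fails, bail out instead of
        # failing the whole build on a flaky network
        if ! gpg --homedir "$(mktemp -d)" --keyserver keyserver.ubuntu.com \
            --recv-keys 0x5DE8949A899C8D99 >/dev/null 2>&1; then
            msg "keyserver unreachable, skipping network-dependent tests"
            return 0
        fi
        go test ./...   # placeholder for the actual test run
    }

The actual patch only skips the tests that need the keyserver rather than the whole run; this sketch just shows the probe-and-skip shape.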
2025-10-12 00:00:57 "configure: error: C compiler cannot create executables" 2025-10-12 11:55:29 algitbot: retry master 2025-10-12 11:55:31 algitbot: retry 3.22-stable 2025-10-12 11:56:02 algitbot: retry master 2025-10-12 13:22:30 algitbot: retry 3.22-stable 2025-10-12 14:39:32 algitbot: retry 3.22-stable 2025-10-12 14:55:31 algitbot: retry-3.22-stable 2025-10-12 14:56:02 algitbot: retry 3.22-stable 2025-10-12 15:27:21 algitbot: retry 3.22-stable 2025-10-12 15:55:00 algitbot: retry 3.22-stable 2025-10-12 15:56:18 value driver.PathNotFoundError = driver.PathNotFoundError{Path:"/7.zb_u_ebm2hcl-qjr/ec6e41/9x_yb", DriverName:"inmemory"} ("inmemory: Path not found: /7.zb_u_ebm2hcl-qjr/ec6e41/9x_yb") 2025-10-12 17:01:50 algitbot: retry 3.22-stable 2025-10-12 17:43:57 algitbot: retry 3.22-stable 2025-10-12 18:32:45 algitbot: retry 3.22-stable 2025-10-12 18:59:20 ikke: is build-3-22-riscv64 stuck? 2025-10-12 19:02:26 possibly 2025-10-12 19:47:42 algitbot: retry 3.22-stable 2025-10-12 20:18:19 algitbot: retry 3.22-stable 2025-10-12 20:21:52 algitbot: retry 3.22-stable 2025-10-12 20:31:46 algitbot: retry master 2025-10-12 22:09:06 oups, missed that they weren't squashed 2025-10-13 08:10:36 ERROR: sqlcipher-libs-4.11.0-r0: trying to overwrite usr/lib/libsqlite3.so.0 owned by sqlite-libs-3.50.4-r1. 2025-10-13 08:10:38 ERROR: sqlcipher-libs-4.11.0-r0: trying to overwrite usr/lib/libsqlite3.so.3.50.4 owned by sqlite-libs-3.50.4-r1. 2025-10-13 08:13:31 broken by !91418 2025-10-13 08:45:42 !91423 2025-10-13 08:47:34 I'll leave it up to someone else to decide if this is the best way to handle the situation 2025-10-13 08:49:12 ffs 2025-10-13 08:50:07 ok, maybe I won't 2025-10-13 08:53:26 algitbot: retry master 2025-10-13 08:55:03 algitbot: retry master 2025-10-13 09:05:23 algitbot: retry master 2025-10-13 09:07:24 algitbot: retry master 2025-10-13 09:13:22 algitbot: retry master 2025-10-13 09:15:22 but why 2025-10-13 09:15:23 algitbot: retry master 2025-10-13 09:16:41 algitbot: retry master 2025-10-13 09:18:21 ikke: do we need to also remove sqlcipher 4.11.0 packages or why aren't they picking up the downgraded one? 2025-10-13 09:21:23 algitbot: retry master 2025-10-13 09:47:06 algitbot: retry master 2025-10-13 10:33:02 algitbot: retry master 2025-10-13 19:16:21 algitbot: retry 3.21-stable 2025-10-13 22:12:08 algitbot: retry master 2025-10-14 10:09:26 what now... 2025-10-14 10:31:10 algitbot: retry master 2025-10-14 11:38:36 E OSError: [Errno 24] No file descriptors available: '/tmp/tmpy9aga0ci' 2025-10-14 11:38:48 on aws-cli 2025-10-14 11:38:52 algitbot: retry master 2025-10-14 11:42:31 algitbot: retry 3.22-stable 2025-10-14 11:45:04 wait, build-3-22-riscv64 was not done with the go 1.24.8 rebuilds? 
2025-10-14 11:55:50 re aws-cli: guess it needs a few retries
2025-10-14 11:56:23 re go: seems so, looking at https://pkgs.alpinelinux.org/packages?name=go&branch=v3.22&repo=&arch=riscv64&origin=&flagged=&maintainer=
2025-10-14 11:56:38 it's impressive how slow it is
2025-10-14 16:06:15 algitbot: retry 3.22-stable
2025-10-14 20:46:47 oh damn, armv7 seems to actually be segfaulting consistently
2025-10-14 20:47:38 er, wait, no
2025-10-14 20:47:40 that's not a segfault
2025-10-14 20:47:42 it's just OOM
2025-10-14 20:47:51 still annoying
2025-10-14 20:48:20 though i'm not sure what i can do to make it pass
2025-10-14 21:03:57 algitbot: retry master
2025-10-14 21:50:21 er
2025-10-15 03:01:10 algitbot: retry 3.22-stable
2025-10-15 15:22:27 algitbot: retry master
2025-10-15 15:39:39 error: variable 'dummy' is uninitialized when passed as a const pointer argument here
2025-10-15 15:39:42 vectorscan
2025-10-15 15:44:25 urgh
2025-10-17 17:14:53 oh for fucks sake
2025-10-17 17:19:08 algitbot: retry master
2025-10-17 18:32:20 git
2025-10-17 18:32:22 not ok 171 - large transaction creating branches does not burst open file limit
2025-10-17 18:32:24 not ok 172 - large transaction deleting branches does not burst open file limit
2025-10-17 18:32:39 let's skip tests for s390x on git
2025-10-17 18:32:56 though i am a little worried about s390x CI in apk-tools, we are seeing ICEs there
2025-10-17 18:33:23 ICE?
2025-10-17 18:33:29 internal compiler error
2025-10-17 18:33:32 right
2025-10-17 18:34:17 my guess is the image has one of the broken musl snapshots, which is what it hopefully is, because i don't want to deal with the s390x machine being cooked right now :p
2025-10-17 18:34:43 since i gather we would have to find a new contact inside IBM
2025-10-17 18:35:03 Ariadne: We did receive new contacts
2025-10-17 18:38:18 Ariadne: rebuilding the build-base image, checking if that will help
2025-10-17 18:39:59 thanks :)
2025-10-17 18:40:57 ugh, apk-tools bad signature
2025-10-17 18:49:18 Ariadne: Does not seem to help
2025-10-17 18:49:23 harrumph
2025-10-17 18:49:53 see https://wiki.postmarketos.org/wiki/Troubleshooting_Alpine_CDN_issues
2025-10-17 18:50:11 i purged the cache for all architectures now
2025-10-17 18:50:30 achill: I already did
2025-10-17 18:50:37 I wrote a tool for it
2025-10-17 18:50:50 (repo-tools in aports)
2025-10-17 18:51:01 repo-tools fastly purge pkg --release edge --origin apk-tools
2025-10-17 18:51:19 hmm then maybe rebuild?
2025-10-17 18:51:25 achill: it was fixed
2025-10-17 18:51:34 after purging
2025-10-17 18:51:36 ah
2025-10-17 18:51:37 nice
2025-10-17 19:04:37 hmm, git checks pass when I run them manually, even when I lower the max open file limit
2025-10-18 18:51:45 algitbot: retry master
2025-10-19 08:59:30 algitbot: retry master
2025-10-19 09:12:55 algitbot: retry master
2025-10-19 09:51:34 algitbot: retry master
2025-10-19 13:31:52 algitbot: retry master
2025-10-19 13:32:23 algitbot: retry master
2025-10-19 16:18:54 bootstrapping ^
2025-10-19 16:22:22 thanks!
2025-10-19 18:10:50 bootstrapping go on aarch64
2025-10-19 18:18:10 ikke: okay, thanks
2025-10-19 18:43:50 bootstrapping go on s390x
2025-10-19 18:53:36 thanks. checking roc-toolkit
2025-10-19 18:54:36 bootstrapping openjdk17 on s390x
2025-10-19 18:59:42 thanks ikke!! :3
2025-10-19 19:01:48 yw :)
2025-10-19 19:05:28 treesitter incompatibility ^
2025-10-19 19:51:31 bootstrapping ^
2025-10-20 05:00:26 algitbot: retry master
2025-10-20 05:01:35 thanks!
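One common mitigation for the armv7 OOM mentioned above is to cap build parallelism for the 32-bit arm builders in the affected APKBUILD; a hedged sketch, where the job count of 4 is an arbitrary example rather than anything agreed on here:

    # 32-bit builders run out of memory with the default -j$JOBS,
    # so cap parallelism on armhf/armv7 only
    case "$CARCH" in
    armhf|armv7) export JOBS=4 MAKEFLAGS="-j$JOBS" ;;
    esac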
2025-10-20 05:03:21 need to upgrade gcc on rv64
2025-10-20 05:03:58 algitbot: retry master
2025-10-20 05:09:26 bootstrapping ^
2025-10-20 05:11:44 okay, thanks!
2025-10-20 07:43:01 ptrc: zimg needs a cmake_minimum_required bump or the 3.5 wrapper, there's also a new release https://github.com/sekrit-twc/zimg/compare/release-3.0.5...release-3.0.6
2025-10-20 13:51:14 algitbot: retry master
2025-10-20 15:48:42 algitbot: retry master
2025-10-20 15:50:10 algitbot: retry master
2025-10-20 15:51:30 algitbot: retry master
2025-10-20 15:52:08 algitbot: retry master
2025-10-20 15:54:01 algitbot: retry master
2025-10-20 15:55:42 algitbot: retry master
2025-10-20 16:14:43 algitbot: retry master
2025-10-20 17:50:44 anyone looking at zimb?
2025-10-20 17:50:46 zimg*
2025-10-20 17:57:54 Increased the cmake level to 3.10 and added samurai to makedepends, but now it complains that test/extra/googletest/build/lib/libgtest.a does not exist
2025-10-20 17:59:11 !91807
2025-10-20 17:59:21 andypost[m] is looking into it
2025-10-20 17:59:36 ack
2025-10-20 18:50:58 hope php82 is fixed now
2025-10-20 18:53:29 Would be good to find out why they fail on the builders only
2025-10-20 19:09:41 algitbot: retry master
2025-10-20 19:52:24 algitbot: retry master
2025-10-21 07:14:37 algitbot: retry master
2025-10-21 08:33:50 algitbot: retry master
2025-10-21 15:16:32 algitbot: retry 3.20-stable
2025-10-21 18:49:33 algitbot: retry 3.22-stable
2025-10-21 18:50:14 Would be good to find out why they fail on the builders only
2025-10-21 18:50:17 oops
2025-10-21 18:50:28 algitbot: retry 3.22-stable
2025-10-21 18:53:43 ? there was a checksum change
2025-10-21 18:54:17 not just on builders, think they were backporting the fix
2025-10-21 18:54:38 mio: the loongarch 3.22 builder was set to fetch from distfiles edge
2025-10-21 18:55:01 mio: that first message was an accidental repeat
2025-10-21 18:55:27 ah okay
2025-10-21 22:58:43 algitbot: restart 3.20-stable
2025-10-21 22:58:55 algitbot: retry 3.20-stable
2025-10-21 23:34:53 algitbot: retry master
2025-10-21 23:37:20 algitbot: retry master
2025-10-22 09:09:31 algitbot: retry master
2025-10-22 09:10:11 * algitbot: retry master
2025-10-22 11:14:52 algitbot: retry master
2025-10-22 11:21:13 * algitbot: retry master
2025-10-22 12:36:38 I wonder if the mautrix-signal build failure would have been caught in CI
2025-10-22 12:36:39 aaeb79aa9e8b8093704758d248da588d5c0ef590
2025-10-22 13:01:46 py3-puppeteer is blocked by chromium not being rebuilt against the newest libsimdutf
2025-10-22 13:03:01 there's a chromium MR in the queue
2025-10-22 13:03:03 !91889 built successfully with !91819 for at least x86_64 https://gitlab.alpinelinux.org/selfisekai/aports/-/jobs/2062991
2025-10-22 13:03:06 yes
2025-10-22 13:03:24 yes, the rebuild doesn't include chromium and electron
2025-10-22 13:03:33 would guess those are meant to follow separately
2025-10-22 13:03:49 but I'm also wondering whether we should temporarily disable py3-puppeteer until the new chromium is built
2025-10-22 13:04:02 to unblock packages being uploaded
2025-10-22 13:04:14 (if I'm not having this backwards again)
2025-10-22 13:06:20 had a similar idea, if the chromium upgrade goes well probably no need
2025-10-22 13:06:57 yes, but I'm not so much worried that it won't go well as that it will take a long time
2025-10-22 13:07:07 for it to complete
2025-10-22 13:07:24 mautrix-signal could also use an unblock
2025-10-22 13:07:30 yes
2025-10-22 13:08:44 okay. no opinion, whichever you think is best
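"Temporarily disable py3-puppeteer" above would usually be done by emptying or trimming arch= in its APKBUILD; a minimal sketch of both variants (values are illustrative, and the channel did not actually settle on doing this):

    # testing/py3-puppeteer/APKBUILD: park the package until chromium is
    # rebuilt against the new libsimdutf, then restore the original value
    arch=""
    # or, when only one architecture is broken (cf. the edk2 ARCH error further down):
    # arch="all !loongarch64"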
2025-10-22 13:31:33 jellyfin-web should be easy
2025-10-22 16:19:27 algitbot: retry master
2025-10-22 17:47:25 algitbot: retry master
2025-10-24 00:37:41 algitbot: retry master
2025-10-24 00:45:04 aalgitbot: retry master
2025-10-24 00:46:07 aalgitbot: retry master
2025-10-24 00:54:33 aalgitbot: retry master
2025-10-24 01:37:59 algitbot: retry master
2025-10-24 01:46:58 algitbot: retry master
2025-10-24 13:51:28 don't the builders usually go to building main/ before building community/ or testing/?
2025-10-24 13:53:11 as in, after building/failing on any of the three, pull sources and start from main/
2025-10-24 13:55:12 just surprised that build-edge-x86_64 is building testing/ now and not main/
2025-10-24 14:22:10 omni: looks like no main was built, but community is
2025-10-24 14:24:25 algitbot: retry master
2025-10-24 14:25:34 algitbot: retry master
2025-10-24 14:27:12 algitbot: retry master
2025-10-24 14:49:05 now
2025-10-24 14:55:20 there
2025-10-25 11:49:21 algitbot: retry master
2025-10-25 11:50:46 algitbot: retry master
2025-10-25 11:52:48 algitbot: retry master
2025-10-26 14:25:36 urgh
2025-10-28 11:40:01 Is there any way to notify the maintainers of these packages in #17558?
2025-10-28 11:50:19 qaqland: ping each maintainer?
2025-10-28 12:04:02 achill: yeah, but not sure if that's too noisy
2025-10-28 12:04:48 no, should be fine, they are the maintainers 🤷
2025-10-28 12:05:24 i know people who appreciate getting pinged by that stuff, since it allows people to get notified and involved for the packages they maintain, instead of a few people doing it for most packages alone
2025-10-28 12:06:36 got it
2025-10-28 12:22:34 appreciate the ping, i'll take a look at my packages in a moment :3
2025-10-28 21:58:26 are we building again or is it just s390x?
2025-10-28 21:58:59 w00t! \:D/
2025-10-28 21:59:05 meh
2025-10-28 21:59:34 build.a.o/distfiles is having a hard time
2025-10-28 22:05:15 on it
2025-10-28 22:05:22 ok
2025-10-28 22:05:34 sorry, did not see :)
2025-10-28 22:05:44 CI would have :p
2025-10-28 22:05:50 i didn't push it
2025-10-28 22:05:52 but yes
2025-10-28 22:09:53 I read it wrong and thought for a second that it was you who had left the checksum line there and that that was what you did not see
2025-10-28 22:10:48 I wrote "on it" two seconds before you pushed, and nothing to be sorry for either way
2025-10-28 22:12:53 Ariadne: and I meant that CI would have caught what you fixed, had it gone that route
2025-10-29 10:30:40 edk2 on loongarch64:
2025-10-29 10:30:42 build.py...
2025-10-29 10:30:42 : error 2000: Invalid parameter
2025-10-29 10:30:43 Invalid ARCH specified. [Valid ARCH: IA32 X64 EBC AARCH64 RISCV64]
2025-10-29 18:34:08 algitbot: retry master
2025-10-30 10:37:48 too bad it's not possible to get to the log
2025-10-30 11:10:04 algitbot: retry master
2025-10-30 11:12:14 omni: now it works
2025-10-30 11:37:03 and builds
2025-10-30 15:09:00 I hope that'll pass on build-3-22-x86_64, I'll be away for a few hours
2025-10-30 20:16:10 algitbot: retry master
2025-10-30 20:24:35 algitbot: retry master
2025-10-30 20:29:41 algitbot: retry master
2025-10-30 20:34:13 algitbot: retry master
2025-10-30 20:58:39 =(
2025-10-30 21:00:39 it built on aarch64, let's see if the tests pass on a second run...
2025-10-30 21:13:00 hmm...
2025-10-30 21:13:18 failures:
2025-10-30 21:13:18 ---- test::tokio_native_tls_tests::simple_tls stdout ----
2025-10-30 21:13:18 thread 'test::tokio_native_tls_tests::simple_tls' panicked at /rustc/1159e78c4747b02ef996e55082b704c09b970588/library/std/src/thread/mod.rs:729:29:
2025-10-30 21:13:18 failed to spawn thread: Os { code: 11, kind: WouldBlock, message: "Resource temporarily unavailable" }
2025-10-30 21:13:20 note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
2025-10-30 21:13:29 algitbot: retry master
2025-10-31 12:49:11 algitbot: retry master
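The "failed to spawn thread ... Resource temporarily unavailable" panic above usually points at the builder's task limit rather than at the crate itself; a hedged sketch of what one might check and try in the aport's check() (the thread cap of 8 is an arbitrary example, not something decided here):

    # how many tasks may the build user create? (RLIMIT_NPROC)
    grep 'Max processes' /proc/self/limits
    # cap libtest's thread pool so the suite stays under that limit
    cargo test --frozen -- --test-threads=8

As discussed around #17599, any such cap has to be chosen so it does not negate what the test is exercising.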