2024-05-01 00:21:56 algitbot: retry master 2024-05-01 00:27:03 I wonder if we should get maintainers involved in fixing the build failures 2024-05-01 00:33:09 not a bad idea 2024-05-01 00:33:43 they may also be better at fixing the issues and should probably be posted anyway 2024-05-01 00:33:43 Hi, omni 2024-05-01 00:33:47 hi 2024-05-01 00:34:09 Hmm, it's May already (noticed while checking lots on irclogs.a.o) 2024-05-01 00:34:14 s/lots/logs/ 2024-05-01 00:35:36 it is 2024-05-01 00:35:53 and this is the first channel with a *-2024-05.log file :) 2024-05-01 00:40:49 I noticed py3-meson-python has no Maintainer 2024-05-01 00:42:54 Hahaha 2024-05-01 00:43:26 "FileNotFoundError: [Errno 2] No such file or directory: 'crap'" 2024-05-01 00:43:31 When building py3-meson-python 2024-05-01 00:43:38 nice 2024-05-01 00:44:39 Apparently, lots of tests there require Git (tried it out in a container without that installed, and got a lot more errors than what i saw on build.a.o) 2024-05-01 00:48:09 speaking of crap, my fan spun up when I clicked the search thing in the crappy github web ui crap 2024-05-01 00:48:31 they have really bloated that web page 2024-05-01 00:49:43 it took a while to get to input some letters in the search box 2024-05-01 00:50:20 They're probably sending your search query to an AI 2024-05-01 00:50:47 that shouldn't spin my fan up 2024-05-01 00:51:04 that tab was suddenly taking all of my system resources 2024-05-01 00:51:31 Qutebrowser? 2024-05-01 00:53:02 yes, with qt5-qtwebengine because qt6-qtwebengine is worse 2024-05-01 00:59:49 algitbot: retry master 2024-05-01 01:06:35 Hmm.... 2024-05-01 01:06:48 tracker was upgraded to 3.7.1 but tracker-miners stayed at 3.6.2 2024-05-01 01:06:51 No wonder it is failing 2024-05-01 01:41:31 algitbot: retry master 2024-05-01 01:43:52 algitbot: retry master 2024-05-01 01:46:22 algitbot: retry master 2024-05-01 01:46:51 algitbot: retry master 2024-05-01 01:46:59 algitbot: retry master 2024-05-01 01:51:43 algitbot: retry master 2024-05-01 01:52:23 algitbot: retry master 2024-05-01 01:54:05 algitbot: retry master 2024-05-01 01:55:04 algitbot: retry master 2024-05-01 02:39:41 algitbot: retry master 2024-05-01 03:34:44 algitbot: retry master 2024-05-01 03:43:28 31 aports left for s390x community, 36 for armhf, 80 for armv7, and 126 for aarch64 2024-05-01 03:54:42 I hope x86 isn't stuck on google-cloud-cpp 2024-05-01 03:57:04 jellyfin: "The specified RuntimeIdentifier 'alpine.3.20.0-arm' is not recognized" 2024-05-01 04:03:45 25 aports left for s390x community, 35 for armhf, 78 for armv7, and 118 for aarch64 2024-05-01 04:21:06 Ok, x86 has moved on now, so it isn't stuck 2024-05-01 04:40:43 I wonder if we'll be able to see <100 packages for aarch64 by the end of the day 2024-05-01 05:29:27 22 aports left for s390x community, 28 for armhf, 71 for armv7, and 103 for aarch64 2024-05-01 05:46:03 100 aports to go for aarch64! 2024-05-01 05:49:33 bootstrapping openjdk17 for aarch64 2024-05-01 21:03:56 algitbot: retry master 2024-05-01 21:04:41 algitbot: retry master 2024-05-02 04:00:56 Last 10 aports for s390x community!! 2024-05-02 04:20:48 I wonder if spotifyd will be fixed by https://github.com/Spotifyd/spotifyd/pull/1220 (it removes the rustc-serialize that's failing, but that's just the side effect of doing some other changes, that look quite major) 2024-05-02 04:51:54 ^ with that, the aports left for s390x community is now in the single-digit 2024-05-02 04:52:06 that = gifski, not the tpm2 thing.. 
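If the extra test failures seen in a bare container really do come from git being absent, the packaging-side fix would presumably be a one-liner like the sketch below (untested; the 'crap' FileNotFoundError above looks like a separate problem):

    # py3-meson-python: a number of tests shell out to git
    checkdepends="$checkdepends git"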
2024-05-02 04:57:27 Time to enumerate the failing aports for s390x 2024-05-02 04:58:38 1. git-interactive-rebase-tool 2. gpick 3. java-lz4 4. kaidan 5. libsurvive 6. massif-visualizer 7. plasma-dialer 8. tpm2-abrmd 9. tpm2-tss-engine 2024-05-02 05:02:40 AUR's libsurvive-git package mentions the sciplot error that we're also having, same thing with AUR's gpick-git that mentions the boost error we're having 2024-05-02 05:02:51 Doesn't look like they were fixed though 2024-05-02 05:04:47 The tpm2 aports have !64820 2024-05-02 07:15:36 kaidan and plasma-dialer seem to require version 5 of Kirigami Addons, but i can't seem to find that in aports anymore 2024-05-02 07:16:22 massif-visualizer on the other hand, requires version 5 of KCharts that is available in kdiagram5 2024-05-02 07:17:34 If i am right about kdiagram5, then !65239 should fix massif-visualizer 2024-05-02 07:17:45 Don't know what to do about kaidan and plasma-dialer though 2024-05-02 07:18:58 So, out of the 9 failing aports on s390x, i think these are left: gpick, java-lz4, kaidan, libsurvive, plasma-dialer 2024-05-02 07:23:06 !64820 should also fix gnome-remote-desktop 2024-05-02 12:55:52 Grrrrrrrrrr 2024-05-02 12:56:17 When i saw the pytest upgrade, i wondered if it woulc cause some trouble 2024-05-02 12:56:20 and it has 2024-05-02 12:56:25 s/woulc/would/ 2024-05-02 12:56:29 :( 2024-05-02 12:56:39 <-- so annoyed by python breaking changes that i can't spell 2024-05-02 12:56:50 Never mind, we have py3-pytest7 2024-05-02 12:58:49 :) 2024-05-02 13:04:54 Now, two more aports are failing with pytest 8.2.0 2024-05-02 13:05:31 Don't think i can switch so many aports to pytest7, these 2 are affected by https://github.com/pytest-dev/pytest/issues/12263 2024-05-02 13:06:17 so, i will try adding this patch: https://github.com/tornadoweb/tornado/pull/3374 2024-05-02 13:13:20 It seems to work 2024-05-02 15:07:54 In other news, FreeBSD has reverted Pytest 8.2.0 2024-05-02 15:24:35 ^ pcc-libs (pcc.ludd.ltu.se (130.240.207.127:80): Host is unreachable) 2024-05-02 19:17:52 algitbot: retry master 2024-05-02 19:18:16 ftr, I'm bootstrapping on x86_64 2024-05-02 21:45:18 algitbot: retry master 2024-05-02 21:46:12 algitbot: retry master 2024-05-02 21:49:34 algitbot: retry master 2024-05-03 00:01:38 Has pytest 8.2.0 or rust 1.78.0 cause anything to go up in flames? 
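For reference, the two pytest 8.2.0 workarounds discussed above look roughly like this in an affected APKBUILD (aport name and patch file name are placeholders; py3-pytest7 is the compat package mentioned earlier):

    # option 1: keep the aport on the old pytest line
    checkdepends="py3-pytest7"

    # option 2: carry the upstream fix (e.g. the tornado PR above) as a local patch
    source="$pkgname-$pkgver.tar.gz::https://example.org/$pkgname-$pkgver.tar.gz
    	pytest-8.2.0-compat.patch
    	"

abuild's default prepare() applies any *.patch file listed in source=, so apart from re-running `abuild checksum` nothing else needs to change.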
2024-05-03 00:01:45 caused* 2024-05-03 00:02:03 From a brief scroll through the backlog it seems the answer is no 2024-05-03 00:02:29 There was only mopidy that failed due to pytest 8.2.0, but i've already covered that with the py3-tornado patch i imported 2024-05-03 00:03:35 Which is also a reminder that maybe CI should be run for Python aport MRs that were created before pytest 8.2.0 was merged (and so didn't run CI with that) 2024-05-03 00:10:59 tbf i expected the pytest upgrade to be more breaking than rust 2024-05-03 00:11:08 also, hm, you're right 2024-05-03 00:11:30 i guess anything that had its last CI run more than a *while* ago should be retried 2024-05-03 00:11:49 FreeBSD has already reverted pytest 8.2.0 2024-05-03 00:19:54 algitbot: retry master 2024-05-03 00:40:19 algitbot: retry master 2024-05-03 00:42:47 algitbot: retry master 2024-05-03 00:58:06 ^ rnp: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1064486 (mentions the same tests that fail once py3-setuptools has been added to makedepends) 2024-05-03 00:58:16 No fix though 2024-05-03 01:19:04 algitbot: retry master 2024-05-03 01:20:19 I think libsurvive is not compatible with sciplot 0.3.x 2024-05-03 01:25:50 algitbot: retry master 2024-05-03 01:26:11 algitbot: retry master 2024-05-03 01:32:40 algitbot: retry master 2024-05-03 04:49:52 algitbot: retry master 2024-05-03 04:50:27 algitbot: retry master 2024-05-03 04:50:49 Flaky tests :/ 2024-05-03 04:52:39 and the worst thing is, not only is it flaky, if it fails, it outputs a huge log file 2024-05-03 04:53:56 algitbot: retry master 2024-05-03 04:55:42 which may be one argument for not using the community/starship method of automatically re-running the tests if it fails 2024-05-03 04:56:14 (we do not want to keep the huge log from the failed runs) 2024-05-03 05:01:07 Ouch, armv7 is building erlang and sbcl at the same time 2024-05-03 05:03:32 I wonder how much faster sbcl would build if it was built by itself 2024-05-03 05:03:52 (now it's built by ecl) 2024-05-03 05:18:29 FindKChart.cmake not found ^ 2024-05-03 05:20:26 I have an MR for that 2024-05-03 05:20:30 The KChart thing 2024-05-03 05:21:12 It's the kaidan and plasma-dialer failure that is worrying 2024-05-03 05:21:39 because Kirigami Addons seem to be available only for KDE6 now 2024-05-03 05:22:14 ^ !65239 2024-05-03 05:23:46 ^ gpick probably isn't compatible with boost1.84 2024-05-03 05:26:31 Similar error reported for boost 1.72: https://github.com/thezbyg/gpick/issues/182 2024-05-03 05:32:54 algitbot: retry master 2024-05-03 07:37:56 algitbot: retry master 2024-05-03 07:43:10 algitbot: retry master 2024-05-03 08:22:27 Now x86 has reached less than 200 aports left for 3.20 community :) 2024-05-03 13:59:16 ansible-lint checksums has changed yet again 2024-05-03 17:43:34 algitbot: retry master 2024-05-03 18:00:12 algitbot: retry master 2024-05-03 23:47:39 algitbot: retry master 2024-05-04 00:05:53 algitbot: retry master 2024-05-04 00:23:48 algitbot: retry master 2024-05-04 00:33:30 algitbot: retry master 2024-05-04 02:36:39 I have a feeling build-3-20-aarch64 may reach 25 aports today 2024-05-04 02:36:49 aports left* 2024-05-04 02:37:16 and build-3-20-x86 could probably be much lower too if it wasn't stuck on py3-tqdm 2024-05-04 02:37:51 With riscv64 and x86_64 probably being stuck too 2024-05-04 02:38:26 i think ppc64le will be the next arch to reach less than 500 aports left 2024-05-04 02:38:42 and from there hopefully on to less than 100 :) 2024-05-04 04:42:54 So, the latest status is: 10 aports left for 
s390x community, 12 for armhf, 18 for armv7, 24 for aarch64, 79 for x86 (stuck), and 709 for ppc64le 2024-05-04 04:47:21 I'm quite sure s390x was at 8 or 9 aports left about 2 days ago 2024-05-04 04:47:27 I wonder how the number went back up to 10 2024-05-04 04:48:06 Did a new failing aport get added? or more likely, that new aport added to the queue depends on one of the failing aports.. 2024-05-04 06:30:02 Uh oh, ppc64le is building keycloak-config-cli again (iirc, the last time it got stuck on this) 2024-05-04 11:32:26 oh? 2024-05-04 11:37:33 commented in !65329 2024-05-04 12:28:50 algitbot: retry master 2024-05-04 12:29:43 Seems to work 2024-05-04 12:30:40 algitbot: retry master 2024-05-04 13:14:49 algitbot: retry master 2024-05-04 14:31:02 algitbot: retry master 2024-05-04 14:34:02 algitbot: retry master 2024-05-04 14:36:07 Latest on the aports left for 3.20 community: 10 for s390x, 13 for armhf, 23 for armv7, 24 for aarch64, 34 for x86, 273 for ppc64le 2024-05-04 14:37:32 Hmm, x86 is back to 35, due to the py3-pytest-async upgrade needed for pytest 8.2.0 compatibility 2024-05-04 14:38:04 but that's probably not going to fail, and will be reached eventually with enough retries 2024-05-04 14:57:44 the black issue is due to pytest 8.2.0 2024-05-04 14:57:46 https://github.com/pytest-dev/pytest/issues/12278 2024-05-04 14:58:01 maybe the pytest upgrade should be reverted? 2024-05-04 14:58:23 The file descriptor not availble error is also due to pytest 8.2.0? 2024-05-04 14:58:36 yes 2024-05-04 14:59:15 Pytest has @fab as maintainer, maybe you should take maintainership so you can make such decisions (whether an upgrade should be reverted or not) 2024-05-04 14:59:38 fab is not really active and does not mind transferring maintainership 2024-05-04 15:00:05 alright i'll do an MR 2024-05-04 15:01:19 Anyway, i've already made a few pytest 8.2.0 compatibility commits, so will probably need to make sure the revert doesn't break those instead 2024-05-04 15:02:41 I think py3-tornado and py3-pytest-asyncio are 2 aports that i patched and upgraded for pytest 8.2.0 compatibility 2024-05-04 15:03:59 I think black has passed on build-edge-x86 though 2024-05-04 15:04:45 Race condition? 2024-05-04 15:05:09 yes 2024-05-04 16:17:27 Firefox has failed :O 2024-05-04 16:24:40 So, firefox-esr was built on edge with Rust 1.77 2024-05-04 16:25:58 algitbot: retry master 2024-05-04 16:26:41 algitbot: retry master 2024-05-04 17:16:04 algitbot: retry master 2024-05-04 18:55:18 algitbot: retry master 2024-05-04 19:45:21 bootstrapping openjdk11 on x86_64 2024-05-04 20:17:42 done 2024-05-04 20:17:50 algitbot: retry master 2024-05-04 21:55:53 algitbot: retry master 2024-05-05 00:53:58 algitbot: retry master 2024-05-05 04:06:30 x86_64 finally reached <1000 aports 2024-05-05 06:30:24 So, 6 architectures have reached 20 aports and below for 3.20 community :) 2024-05-05 06:30:46 Hopefully, x86_64 will join them soon.. 2024-05-05 06:32:14 I'm a bit worried about rv64 though 2024-05-05 11:46:01 why did s390x start rebuilding ceph17 before building apache-arrow 16.0.0? 2024-05-05 11:47:17 possibly: 1. Dependency cycle. 2. Dependecy in subpackage that cannot be detected 2024-05-05 11:49:39 ikke: could it be sorted manually so that we don't need to rebuild ceph17 again? 
2024-05-05 11:50:14 like as in aborting the ceph17 build on s390x 2024-05-05 11:51:09 seems like it did the correct order on build-3-20-s390x 2024-05-05 11:51:16 ^ 2024-05-05 11:51:23 thanks 2024-05-05 11:52:15 thank you very much even =) 2024-05-05 11:52:24 ceph takes such a long time to build 2024-05-05 11:52:35 now I'll afk for a bit 2024-05-05 11:53:10 now it does build apache-arrow first 2024-05-05 11:54:13 I hope they will have the correct order on the 3-20 builders where it hasn't started yet 2024-05-05 11:55:18 ap builddirs does seem to indicate the correct order 2024-05-05 11:55:30 perhaps we could help out keeping watch to see that apache-arrow 16 has been built when any starts building cep17 or ceph18 2024-05-05 11:56:14 now afk 2024-05-05 15:46:45 Nice, 6 architectures are now at <15 aports left 2024-05-05 16:11:03 Hmm, it seems s390x just has 3 failing aports left 2024-05-05 16:11:19 They should be: 1. gpick 2. homer-app 3. java-lz4 2024-05-05 16:51:44 algitbot: retry master 2024-05-05 16:51:54 homer-app should be a simple fix, not? 2024-05-05 23:35:41 algitbot: retry master 2024-05-05 23:36:24 I think ezstream passed on other archs 2024-05-05 23:36:28 algitbot: retry master 2024-05-06 00:10:32 algitbot: retry master 2024-05-06 01:25:28 algitbot: retry master 2024-05-06 01:26:17 algitbot: retry master 2024-05-06 01:28:44 algitbot: retry master 2024-05-06 01:29:08 algitbot: retry master 2024-05-06 01:35:35 algitbot: retry master 2024-05-06 01:36:29 algitbot: retry master 2024-05-06 03:41:07 lol 2024-05-06 03:41:41 My connection is finally a bit better (still slow though, just not very slow) 2024-05-06 03:42:15 but good enough that i can finally see the "Merging, drum roll please" message in Gitlab 2024-05-06 03:49:52 To the 4 aports left in armhf, i think armv7 and ppc64le add 2 more (to make it 6), and aarch64 and x86 add 3 more on top of that (so 9) 2024-05-06 03:52:00 rkward is failing on aarch64, armv7, and x86; while ezstream seems to fail only on aarch64 2024-05-06 03:52:08 I'll try to see if i can find the others 2024-05-06 03:56:11 It seems Rust 1.78 broke ncspot, so that affects ppc64le but not x86 and the ARMs 2024-05-06 03:56:27 I mean, ppc64le is the one trying to build it with 1.78, the other 4 built it with 1.77 2024-05-06 03:57:12 neo4j is only enabled for aarch64 and x86_64 2024-05-06 03:57:42 and jellyfin only for aarch64, armv7, and x86_64 2024-05-06 04:02:10 cargo-edit tests segfault on x86, but seem ok on the others (it's not Rust 1.78 this time, as the others used that too) 2024-05-06 04:03:33 ppc-libs is only enabled for x86 and x86_64 2024-05-06 04:16:21 I wonder if aws-c-io and libei tests may be failing on riscv64 due to them timing out 2024-05-06 04:23:48 aarch64 adds to armv7's jellyfin and rkward: ezstream, neo4j, and very likely, jellyfin-web (as this depends on jellyfin, and is disabled for armv7) 2024-05-06 13:14:04 I wonder if we should just switch pcc-libs (and pcc) to distfiles 2024-05-06 13:14:29 but i think it would have to be distfiles.a.o/distfiles/v3.19, as they don't seem to be available in edge 2024-05-06 14:54:34 algitbot: retry master 2024-05-06 14:55:53 algitbot: retry master 2024-05-06 15:13:39 algitbot: retry master 2024-05-06 15:44:27 test 2024-05-06 15:46:02 test failed 2024-05-06 19:43:15 downterm 2024-05-07 00:10:49 algitbot: retry master 2024-05-07 00:14:30 algitbot: retry master 2024-05-07 00:25:15 algitbot: retry master 2024-05-07 00:31:09 algitbot: retry master 2024-05-07 00:51:15 Hmm, is redict also affected by #16056? 
but how did valkey solve it..seems like it just passed after enough retries? 2024-05-07 00:51:27 algitbot: retry master 2024-05-07 04:30:01 I think pg_probackup needs to be disabled for the same archs as postgresql15 2024-05-07 05:31:49 Time sensitive tests? 2024-05-07 05:31:50 algitbot: retry master 2024-05-07 06:06:16 algitbot: retry master 2024-05-07 07:13:10 algitbot: retry master 2024-05-07 11:29:12 helix: https://github.com/helix-editor/helix/commit/0546273570710b97e9eebfff84298afbbb372f02 2024-05-07 15:14:50 algitbot: retry master 2024-05-07 15:17:00 algitbot: retry master 2024-05-07 15:18:45 algitbot: retry master 2024-05-07 15:26:32 algitbot: retry master 2024-05-07 15:48:20 algitbot: retry master 2024-05-07 16:00:32 algitbot: retry master 2024-05-07 16:02:29 algitbot: retry master 2024-05-07 16:04:32 algitbot: retry master 2024-05-07 16:06:55 algitbot: retry master 2024-05-07 16:08:41 algitbot: retry master 2024-05-07 16:09:46 I think ppc64le only has rnp and virt-manager left that are failing 2024-05-07 16:10:15 algitbot: retry master 2024-05-07 16:11:31 algitbot: retry master 2024-05-07 16:11:49 Same with armhf 2024-05-07 16:12:46 armv7 additionally has jellyfin and rkward 2024-05-07 16:12:59 algitbot: retry master 2024-05-07 16:14:52 algitbot: retry master 2024-05-07 16:15:42 algitbot: retry master 2024-05-07 16:17:41 algitbot: retry master 2024-05-07 16:18:55 algitbot: retry master 2024-05-07 16:20:05 and aarch64 adds ezstream, jellyfin-web, and neo4j 2024-05-07 16:21:19 Hmm, 584 aports left for x86_64 2024-05-07 16:25:20 and 1962 for riscv64 2024-05-07 16:27:41 algitbot: retry master 2024-05-07 16:29:20 As for x86, it should be: cargo-edit, pcc, pcc-libs, rkward, rnp, virt-manager 2024-05-08 01:56:39 algitbot: retry master 2024-05-08 02:00:30 algitbot: retry master 2024-05-08 02:14:33 algitbot: retry master 2024-05-08 04:21:58 algitbot: retry master 2024-05-08 10:32:46 algitbot: retry 3-19-stable 2024-05-08 10:33:01 algitbot: retry 3.19-stable 2024-05-08 16:07:02 algitbot: retry master 2024-05-08 21:39:47 fitting name, code blocks 2024-05-09 01:41:37 Codeblocks is still blocking the builders :( 2024-05-09 01:42:00 Probably it should be renamed Builderblocks 2024-05-09 01:44:39 I'm thinking of just disabling it, with the reason that 13 volatile patches is too many, and a request to re-enable it can be made when that has been reduced to a manageable number (like 3), probably by merging all the small patches 2024-05-09 01:45:45 celie: ++ just file issue to unblock someday 2024-05-09 01:46:09 Not sure i get what you mean 2024-05-09 01:46:19 You mean disable it and file an issue? 2024-05-09 01:46:55 yes, usually disabled packages has ref to issue to enable 2024-05-09 02:30:56 #16098 2024-05-09 11:11:05 So, only "A" works? 
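As noted above, disabled aports normally carry a pointer to the tracking issue; for the codeblocks case that would look something like this sketch (whether to drop all arches or only the failing ones depends on the breakage, cf. pg_probackup following postgresql15's arch list earlier):

    # community/codeblocks/APKBUILD
    # temporarily disabled: too many volatile patches to keep buildable,
    # see https://gitlab.alpinelinux.org/alpine/aports/-/issues/16098
    arch=""	# or only drop the failing arches, e.g. arch="all !s390x"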
2024-05-09 11:11:14 Cpr 200 2024-05-09 11:11:41 Cissue 300 2024-05-09 11:12:10 Tissue 404 2024-05-09 11:12:52 Zissue 404 2024-05-09 11:13:06 Cissue 403 2024-05-09 11:13:22 Cissue 404 2024-05-09 11:13:25 Cissue 405 2024-05-09 11:13:34 I think 404 just went 404 2024-05-09 11:14:09 Tissue 1234 2024-05-09 13:20:35 x86_64 has less than 400 aports left for 3.20 community 2024-05-09 13:21:00 Wonder if rv64 will make it 2024-05-09 13:23:30 Maybe with more (failing) aports disabled, it will make it 2024-05-09 14:02:02 x86_64 is building Deno again 2024-05-09 14:02:03 Let's hope it gets through this time 2024-05-09 16:42:21 Deno fails again :( 2024-05-09 16:42:31 This is the 3rd time, i think 2024-05-09 16:43:05 We should make an issue for jirutka 2024-05-09 16:45:18 I wonder if an upgrade will solve it, i saw Deno 1.43.2, but didn't feel up to making an MR for that, as the APKBUILD is rather involved 2024-05-09 16:46:09 Hopefully the new version works with Rust 1.78, or it'd open another can of worms (aarch64 has already built Deno successfully..but with Rust 1.77) 2024-05-09 16:52:53 should we just open an issue and disable it for now? 2024-05-09 16:53:16 algitbot: retry master 2024-05-09 16:54:26 ^ pcc-libs source website cannot be reached 2024-05-09 16:54:53 easy fix is to copy the source on distfiles to v3.20 2024-05-09 16:55:14 but from v3.19 2024-05-09 16:55:20 I think i checked edge, and they weren't there 2024-05-09 16:55:37 indeed 2024-05-09 16:55:55 done 2024-05-09 16:56:23 Probably pcc needs the same thing done too 2024-05-09 16:57:48 done 2024-05-09 16:58:08 Thanks :) 2024-05-09 16:59:05 rnp should be fixed by !65414 still awaiting a review from jirutka 2024-05-09 17:00:14 and virt-manager has #15924 with link to upstream issue 2024-05-09 17:00:38 (upstream has determined the issue is not caused by pytest 8, bit by new libvirt) 2024-05-09 17:00:45 s/bit/but/ 2024-05-09 17:02:01 rnp and virt-manager should be the last blockers for armhf 2024-05-09 17:02:50 armv7 additionally has jellyfin and rkward 2024-05-09 23:49:38 algitbot: retry master 2024-05-10 00:17:25 algitbot: retry master 2024-05-10 01:45:43 algitbot: retry master 2024-05-10 02:16:46 So, netpbm failed, followed almost immediately by openjdk21 (on build-3-20-x86_64) 2024-05-10 02:17:23 I saw netpbm on build.a.o, but missed seeing it in the "last error" column, and thought it had succeeded, apparently not 2024-05-10 03:37:49 build-3-20-x86_64 now has less than 300 aports to go 2024-05-10 03:38:00 Hopefully that will be 200 aports by the end of the day 2024-05-10 13:38:47 oh-no, I got the wrong date in that and the correct one is earlier 2024-05-10 13:47:47 What date? 
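The distfiles switch floated for pcc/pcc-libs would amount to pointing source= at the Alpine mirror instead of the unreachable upstream host, roughly as below (a sketch only; in the end copying the files on the distfiles server itself was enough, so no APKBUILD change was needed):

    # upstream pcc.ludd.ltu.se is unreachable; fetch Alpine's archived copy instead
    source="https://distfiles.alpinelinux.org/distfiles/v3.19/pcc-libs-$pkgver.tar.gz"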
2024-05-10 14:47:13 Hmm, today seems like a quiet day commits-wise 2024-05-10 14:50:00 build-3-20-x86_64 keeps trying (and failing) to build ghc 2024-05-10 15:02:03 algitbot: retry master 2024-05-10 15:02:40 algitbot: retry master 2024-05-10 15:02:51 algitbot: retry master 2024-05-10 15:03:44 algitbot: retry master 2024-05-10 15:05:39 algitbot: retry master 2024-05-10 15:06:36 algitbot: retry master 2024-05-10 15:11:37 algitbot: retry master 2024-05-10 15:13:10 build-3-20-x86_64 has 179 aports left to build 2024-05-10 15:39:46 algitbot: retry master 2024-05-10 16:00:57 algitbot: retry master 2024-05-10 16:03:05 build-3-20-x86_64 has landed on ghc again 2024-05-10 16:04:27 I probably won't be around to see if fail again (well.....i hope it doesn't) 2024-05-10 16:04:31 s/if/it/ 2024-05-10 16:20:40 algitbot: retry master 2024-05-10 16:50:19 nmeum: 6 2024-05-10 16:50:21 ^ 2024-05-10 17:41:06 algitbot: retry master 2024-05-10 20:09:53 ikke: the ghc failure looks like a python 3.12 issue or something, will try to have a look on the weekend 2024-05-11 00:46:55 algitbot: retry master 2024-05-11 00:49:41 algitbot: retry master 2024-05-11 01:15:23 algitbot: retry master 2024-05-11 01:43:06 algitbot: retry master 2024-05-11 01:44:35 perl-sys-cpu failed on riscv64 2024-05-11 01:45:04 That's one perl-* aport i didn't claim maintainership of, as i think it's been deleted from CPAN 2024-05-11 01:45:40 I think it's still a dependency of a few other aports though 2024-05-11 01:48:44 Looking at the tests file, it seems it is failing test 3 (cpu speed), and 4 (cpu type) 2024-05-11 01:49:18 Probably this is the first time tests are actually running on riscv64 2024-05-11 01:50:40 Seems like it is only required by community/zoneminder, which is disabled for riscv64 already 2024-05-11 01:50:48 So, maybe let's just disable perl-sys-cpu 2024-05-11 01:50:50 for riscv64 2024-05-11 01:54:11 Let me see if tests also fail in riscv64 CI 2024-05-11 01:55:30 Ok, it fails too 2024-05-11 01:58:07 Hmm, Debian fixed this by marking the 2 tests as "TODO" 2024-05-11 02:00:13 I just realized that the APKBUILD also disables tests for aarch64 2024-05-11 02:05:35 and yes, the test also fails on aarch64 2024-05-11 02:07:27 lol, from the perl-sys-cpu test file: "We start with some black magic to print on failure." 2024-05-11 03:57:26 algitbot: retry master 2024-05-11 04:04:57 riscv64 has reached <1000 aports left 2024-05-11 04:07:18 algitbot: retry master 2024-05-11 04:25:31 algitbot: retry master 2024-05-11 04:33:07 riscv64 has 977, and x86_64 has 176 left 2024-05-11 04:48:37 algitbot: retry master 2024-05-11 04:49:04 algitbot: retry master 2024-05-11 04:49:35 algitbot: retry master 2024-05-11 04:51:18 I wonder if we can get to <100 for x86_64 by the end of today 2024-05-11 05:29:16 algitbot: retry master 2024-05-11 07:28:31 159 aports left for x86_64 2024-05-11 07:38:20 With virt-manager failing tests disabled, now rnp should be the last blocker on armhf :) 2024-05-11 07:38:58 algitbot: retry master 2024-05-11 07:39:36 and maybe ppc64le too 2024-05-11 07:40:23 algitbot: retry master 2024-05-11 07:40:39 algitbot: retry master 2024-05-11 07:41:28 algitbot: retry master 2024-05-11 07:43:25 algitbot: retry master 2024-05-11 07:47:41 algitbot: retry master 2024-05-11 07:55:00 algitbot: retry master 2024-05-11 08:52:50 fix for rkward in !65773 2024-05-11 08:55:24 👍 2024-05-11 09:01:59 ncopa: haha, looks like we look at the same thing 2024-05-11 09:02:09 *looked 2024-05-11 09:02:27 nmeum: indeed. 
i in parallel was looking at ghc :) 2024-05-11 09:02:41 ncopa: your patch is missing the .data() invocation on the QByteArray though. this should fail check or at least it did for my patch 2024-05-11 09:03:11 oh ok. feel free to fix it 2024-05-11 09:03:20 im off for today now 2024-05-11 09:03:41 sure, have a nice day :) 2024-05-11 09:03:52 there is also the rnp, which should be reported upstream 2024-05-11 09:04:03 you too! thanks! 2024-05-11 09:04:17 rnp is fixed by !65414 2024-05-11 09:04:42 awesome! 2024-05-11 09:23:03 bootstrapping openjdk21 2024-05-11 13:19:14 algitbot: retry master 2024-05-11 13:40:07 algitbot: retry master 2024-05-11 13:51:32 I hope build-3-20-riscv64 and build-3-20-x86_64 are not both stuck 2024-05-11 13:53:30 x86_64 seems to be still going 2024-05-11 13:54:26 rv64 may be hanging 2024-05-11 14:02:28 algitbot: retry master 2024-05-11 15:37:24 algitbot: retry master 2024-05-11 16:45:50 algitbot: retry master 2024-05-11 17:06:27 algitbot: retry master 2024-05-12 01:56:35 algitbot: retry master 2024-05-12 02:31:25 Oh now, build-3-20-x86_64 started building Deno again 2024-05-12 03:16:29 algitbot: retry master 2024-05-12 03:16:51 901 aports left for riscv64 2024-05-12 03:16:55 community 2024-05-12 04:43:19 algitbot: retry master 2024-05-12 04:46:44 algitbot: retry master 2024-05-12 05:15:09 algitbot: retry master 2024-05-12 06:04:12 riscv64 is building py3-django-webpack-loader again 2024-05-12 06:08:16 I already told it not to do that, that naughty builder 2024-05-12 06:09:04 When that was built on edge, it took 2.5 hours 2024-05-12 06:09:37 There is progress, last time there was one '.' as the last output, now it's '..' 2024-05-12 06:11:06 I'll let it continue 2024-05-12 06:12:02 What about the WebkitGTKs on ARM builders? 2024-05-12 06:13:21 Having 2 copies of it being built on all 3 archs can't be a good thing.. 2024-05-12 06:13:25 nope 2024-05-12 06:14:24 Host is desperately looking for scrapes of free memory 2024-05-12 06:14:55 Do all 3 archs share the same host? 2024-05-12 06:15:14 yes 2024-05-12 06:15:20 Wow 2024-05-12 06:15:26 That would be 6 copies of WebkitGTK 2024-05-12 06:16:31 I'm rebooting the host 2024-05-12 06:16:34 If I can 2024-05-12 12:34:58 cely: is there any specific reason why you use a custom basename macro for frr instead of including libgen.h for musl's version? 2024-05-12 12:35:50 I think it's due to const vs non-const 2024-05-12 12:35:58 The libgen.h version is non-const 2024-05-12 12:36:36 yes, it modifies the input buffer but POSIX does not require it to remain unmodified 2024-05-12 12:37:02 does frr require it to remain unmodified? 
2024-05-12 12:37:45 I think i was made aware of this after seeing the discussion in !62810 2024-05-12 12:40:16 I see 2024-05-12 12:40:44 and if you look through the log file for -r1 that you have in #16106 2024-05-12 12:40:57 for frr-10.0-r1 2024-05-12 12:41:01 and search for libgen.h 2024-05-12 12:41:17 You see a "warning: type basename does not match original declaration" 2024-05-12 12:41:42 Not sure why it included libgen.h automatically 2024-05-12 12:41:58 When earlier it gave it an implicit declaration instead 2024-05-12 12:43:14 Anyway, so when i've worked on patching basename, i check for const, and if i see it i patch it with the macro instead of including libgen.h 2024-05-12 12:44:04 well the mismatch thing is an lto error, so that probably happpens when optimzing across multiple translation units as some include libgen.h 2024-05-12 12:44:13 Ok 2024-05-12 12:44:43 while the implicit declaration is a warning emitted for the zebra_netns_notify.c translation unit which doesn't include libgen.h 2024-05-12 12:45:12 checking if the input string is expected to be const makes sense indeed 2024-05-12 12:45:16 thanks for clarifying! :) 2024-05-12 12:45:51 You're welcome 2024-05-12 14:04:15 Hmm, so J0WI has merged rnp 0.17.1, but sadly the ARM builders are......still fighting to build WebkitGTK 2024-05-12 14:05:17 and x86 still has cargo-edit tests which are segfaulting, so the only arch that was unblocked for 3.20 is ppc64le 2024-05-12 14:06:20 algitbot: retry master 2024-05-12 14:08:32 algitbot: retry 3.19-stable 2024-05-12 14:08:56 And I don't have OOB access to the builder atm, so I cannot force reboot it 2024-05-12 14:09:02 ssh is hanging 2024-05-12 14:09:18 WebkitGTK killed the builder :( 2024-05-12 14:10:45 CPU System time is 100% 2024-05-12 14:10:49 or close to 2024-05-12 14:10:59 How is Deno doing on build-3-20-x86_64? 2024-05-12 14:11:33 Not a lot is happening 2024-05-12 14:11:48 thread 'main' panicked at cli/lsp/tsc.rs:1171:65: 2024-05-12 14:11:56 called `Result::unwrap()` on an `Err` value: TypeError: serde_v8 error: invalid type; expected: object, got: string 2024-05-12 14:12:32 So a panic that causes it to hang 2024-05-12 14:13:05 algitbot: retry master 2024-05-12 14:17:53 Hm, that's ^ the Go rebuild, not 3.20 2024-05-12 14:18:05 *continues waiting* 2024-05-12 14:24:01 *The waiting intensifies* 2024-05-12 14:24:34 Our monitoring sporadically gets some data from the host 2024-05-12 14:24:47 for ARM? 2024-05-12 14:24:49 Yes 2024-05-12 14:25:02 Last what I see is ~30m ago 2024-05-12 14:25:57 There goes cargo-edit 2024-05-12 14:26:27 108 aports left for x86_64 3.20 community 2024-05-12 14:26:39 ... 2024-05-12 14:26:51 109 now with the addition of py3-channels 2024-05-12 14:28:33 Oh no! 
Deno again :( 2024-05-12 14:29:49 ACTION wonders if would should use something like earlyoom on the host 2024-05-12 14:29:54 if we* 2024-05-12 14:31:58 Speaking of earlyoom, i took a look at the latest upgrade (1.8.2), saw that it only modified the systemd service file, and decided to ignore it 2024-05-12 14:37:18 Phew, py3-stestr has finished building on riscv64 2024-05-12 14:39:49 Yeah, I see some projects even bumping versions for CI changes 2024-05-12 14:39:56 like, why 2024-05-12 14:43:41 rqlite just did that, released 5 versions in a day 2024-05-12 14:53:56 Ah, there we go 2024-05-12 14:55:02 I wonder if anyone expected s390x and ppc64le to be the first 2 archs to finish building 3.20 2024-05-12 14:57:12 Usually they have less to build 2024-05-12 14:57:24 Packages get easier disabled for those arches 2024-05-12 17:27:20 ooohg 2024-05-12 17:27:36 access again 2024-05-12 18:05:51 FYI, I've stopped the arm edge builders to prevent it from exhausting all memory again 2024-05-12 18:06:16 algitbot: retry master 2024-05-12 19:23:15 algitbot: retry master 2024-05-12 19:23:52 algitbot: retry master 2024-05-12 19:32:14 algitbot: retry master 2024-05-12 20:12:49 algitbot: retry master 2024-05-12 20:40:32 algitbot: retry master 2024-05-12 21:05:53 algitbot: retry master 2024-05-13 02:19:49 algitbot: retry master 2024-05-13 02:20:56 algitbot: retry master 2024-05-13 03:07:11 I don't think we can leave the builders in a failed state for so long 2024-05-13 03:07:18 So: 2024-05-13 03:10:28 algitbot: retry master 2024-05-13 03:31:46 Ok, why did s390x get stuck on gitlab-runner :/ 2024-05-13 03:34:18 Oh, phew...s390x on edge completed 2024-05-13 03:34:26 Hopefully, 3.20 will follow 2024-05-13 03:35:04 algitbot: retry master 2024-05-13 04:17:54 I think something is going on with the s390x CI runner 2024-05-13 04:25:20 algitbot: retry 3.19-stable 2024-05-13 04:27:59 Ok, so 5 archs (s390x, ppc64le, x86, armhf, and armv7) have uploaded 3.20 community now 2024-05-13 04:28:33 aarch64 still has a few blockers 2024-05-13 04:29:02 algitbot: retry master 2024-05-13 04:29:57 jellyfin, jellyfin-web, neo4j..still trying to see what the 4th aport for aarch64 is 2024-05-13 04:30:00 algitbot: retry master 2024-05-13 04:31:23 algitbot: retry master 2024-05-13 04:32:11 Ah, it's gitlab-runner, and that hopefully shouldn't fail 2024-05-13 04:32:16 It didn't 2024-05-13 04:32:46 So, 3 blockers left for aarch64: jellyfin, jellyfin-web, and neo4j 2024-05-13 04:33:09 and i'll re-enable jellyfin for armv7 after that has uploaded community 2024-05-13 04:37:06 and the Go rebuild for s390x has completed, with Dendrite finally passing 2024-05-13 04:48:02 algitbot: retry master 2024-05-13 04:49:06 algitbot: retry master 2024-05-13 04:49:33 algitbot: retry master 2024-05-13 04:56:45 algitbot: retry master 2024-05-13 05:00:46 90 aports left for x86_64 community 2024-05-13 05:16:08 78 aports left for x86_64 community 2024-05-13 05:17:20 ^ and the Go rebuild in 3.19 has completed 2024-05-13 05:40:44 algitbot: retry master 2024-05-13 05:44:29 algitbot: retry master 2024-05-13 05:52:20 So, neo4j passed on x86_64..but not on aarch64 2024-05-13 05:57:21 65 aports left for x86_64 community 2024-05-13 06:01:17 Passed on edge, failed on 3.20 2024-05-13 06:01:24 algitbot: retry master 2024-05-13 06:24:30 52 aports left for x86_64 community 2024-05-13 07:00:33 algitbot: retry master 2024-05-13 07:07:33 riscv64 finished gotosocial before s390x 2024-05-13 07:08:14 algitbot: retry master 2024-05-13 07:09:23 algitbot: retry master 2024-05-13 
07:09:39 algitbot: retry master 2024-05-13 07:21:31 42 aports left for x86_64 community 2024-05-13 07:30:15 Finally <40 aports for x86_64 :) 2024-05-13 07:38:57 30 aports left for x86_64 community 2024-05-13 07:42:13 ^ s390x has finally completed gotosocial on edge 2024-05-13 08:00:37 Maybe it's time to bring build-edge-armhf back online? 2024-05-13 08:02:30 20 aports left for x86_64 community (not including the new KF 6.2.0 aports that were just merged) 2024-05-13 08:34:10 s390x is moving very slowly 2024-05-13 08:34:31 algitbot: retry master 2024-05-13 08:39:11 algitbot: retry master 2024-05-13 13:01:50 algitbot: retry master 2024-05-13 13:02:20 algitbot: retry master 2024-05-13 14:26:30 Oops, it seems riscv64 was retried before i pushed my fix? 2024-05-13 14:27:41 It will get it the next time then, unless the builder suddenly got a speed boost and can finish compiling Rust tests in 10 minutes 2024-05-13 17:19:59 algitbot: retry master 2024-05-13 17:34:00 I have a feeling that aarch64 will also fail to build Deno if !check is removed for it 2024-05-13 17:35:07 new aport* 2024-05-13 17:50:20 algitbot: retry master 2024-05-13 17:54:59 ^ sus 2024-05-13 18:02:08 algitbot: retry master 2024-05-13 18:03:59 algitbot: retry master 2024-05-13 19:21:44 algitbot: retry master 2024-05-13 22:00:42 fwiw, the last two commits don't actually rebuild anything 2024-05-14 01:24:56 algitbot: retry master 2024-05-14 03:24:44 algitbot: retry master 2024-05-14 03:25:43 algitbot: retry master 2024-05-14 03:27:27 algitbot: retry master 2024-05-14 03:29:53 algitbot: retry master 2024-05-14 03:34:12 Kakoune probably needs `make -j1 install` in package(), but so far, retrying has managed to make it pass on s390x and ppc64le 2024-05-14 03:34:14 algitbot: retry master 2024-05-14 03:44:36 Seems to have worked 2024-05-14 03:44:55 Let's see what build-edge-riscv64 says 2024-05-14 05:47:41 algitbot: retry master 2024-05-14 05:51:35 algitbot: retry master 2024-05-14 08:11:39 algitbot: retry master 2024-05-14 08:35:05 algitbot: retry master 2024-05-14 08:41:57 Ah, there's build-3-20-aarch64 having finished uploading community :) 2024-05-14 09:00:24 algitbot: retry master 2024-05-14 12:45:49 algitbot: retry master 2024-05-14 13:16:59 algitbot: retry master 2024-05-14 14:07:26 algitbot: retry master 2024-05-14 14:32:19 There goes Deno 2024-05-14 14:38:00 ACTION ponders what it would take to fix a 3rd Java aport (after java-lz4 and neo4j) 2024-05-14 15:19:56 build-3-20-x86_64 is back to building Deno 2024-05-14 15:20:49 cely: still expected to hang? 2024-05-14 15:21:32 I think so, i don't think we've changed anything 2024-05-14 15:21:35 ok 2024-05-14 15:22:01 I'm running 1.42.4 through CI, but i'm expecting some tests to fail 2024-05-14 15:31:22 algitbot: retry master 2024-05-14 15:48:40 I see a lot of failures in Deno 1.42.4 tests :( 2024-05-14 15:56:28 I see a lot more failure in 1.42.4 than 1.43.3 2024-05-14 16:11:15 cely: so that's progress, i guess? 2024-05-14 16:11:18 oh, no 2024-05-14 16:11:19 backwards 2024-05-14 16:12:52 What's backwards? 
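If the Kakoune failure really is an install-time race, the usual workaround is to serialize just that step, e.g. (sketch; the exact make variables depend on the project's Makefile):

    package() {
    	# install target appears racy when run in parallel
    	make -j1 DESTDIR="$pkgdir" install
    }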
2024-05-14 16:13:15 I mean, .4 has more failures than .3, so not progress, its getting worse 2024-05-14 16:13:46 Major version, major version 2024-05-14 16:14:42 potato tomato 2024-05-14 16:15:19 I've recently typed digits in the wrong order quit eoften 2024-05-14 16:15:54 It's 1.42.4 vs 1.43.3 2024-05-14 16:16:23 I'm looking through old npm versions to find out which one causes the lone 1.43.3 failing test to fail 2024-05-14 16:17:34 Ah, reading is an art 2024-05-14 16:17:45 as is typing! 2024-05-14 16:20:39 Wish me luck 2024-05-14 16:21:30 Bingo 2024-05-14 16:21:47 The version is in between 10.5.0 to 10.7.0 2024-05-14 16:21:54 So, a new change, no wonder Deno upstream doesn't account for it 2024-05-14 16:21:56 🍀 2024-05-14 16:24:02 Narrowed it down to between npm 10.5.1 and 10.6.0 2024-05-14 16:24:29 Now to see if i can find the exact commit 2024-05-14 16:25:00 ACTION shakes head how test failures are related to the version of the package manager 2024-05-14 16:29:59 https://github.com/npm/cli/pull/7414 2024-05-14 16:32:01 what now then... 2024-05-14 16:32:06 algitbot: retry master 2024-05-14 16:34:32 So, it's just a s/ERR!/error/ 2024-05-14 16:48:12 algitbot: retry master 2024-05-14 17:04:44 Hmm, i didn't notice the previous Deno MR 2024-05-14 18:01:27 algitbot: retry master 2024-05-14 19:14:06 until now! 2024-05-14 19:29:01 omni: looks like deno is build 3rd day 2024-05-14 20:23:44 algitbot: retry master 2024-05-14 21:29:44 algitbot: retry master 2024-05-14 21:54:13 ( working on polishing loongarch64 stuff, none of these commits actually rebuild anything if anyone was worried :) ) 2024-05-14 23:12:56 algitbot: retry master 2024-05-14 23:14:05 algitbot: retry master 2024-05-15 00:36:23 algitbot: retry master 2024-05-15 00:57:16 So, Deno's npx non-existent test passed after i patched it but some other test failed instead: "specs::cert::cafile_install" 2024-05-15 00:58:49 "/tmp/deno-cli-testRs6M7M/./bin/echo_test: exec: line 3: deno: not found" 2024-05-15 00:59:01 algitbot: retry master 2024-05-15 01:03:41 algitbot: retry master 2024-05-15 01:41:57 algitbot: retry master 2024-05-15 01:53:16 algitbot: retry master 2024-05-15 01:54:23 algitbot: retry master 2024-05-15 01:56:57 algitbot: retry master 2024-05-15 01:59:05 algitbot: retry master 2024-05-15 02:14:24 algitbot: hi 2024-05-15 02:14:27 algitbot: retry master 2024-05-15 02:14:54 Wow, a lag time of 2 minutes 2024-05-15 04:53:51 algitbot: hi 2024-05-15 05:49:50 :) 2024-05-15 05:49:53 Finally uploaded 2024-05-15 05:50:21 Whrn riscv64 is done, it will be finally finally finally uploaded 2024-05-15 05:50:25 When* 2024-05-15 06:12:38 uhm... why didn't the aarch64 and armv7 builders start? 2024-05-15 07:31:06 algitbot: retry master 2024-05-15 12:13:09 Interesting, so apparently specs::cert::cafile_install runs "$builddir/target/release/deno install --cert RootCA.pem -n echo_test --root /tmp/ https://localhost:5454/echo.ts" 2024-05-15 12:14:58 Perhaps "$builddir/target/release" should be placed in $PATH so the test can find `deno` 2024-05-15 13:01:36 algitbot: retry master 2024-05-15 13:12:36 ah come on it were passing on the mr 😞 2024-05-15 14:10:16 algitbot: retry master 2024-05-15 14:12:54 algitbot: retry master 2024-05-15 14:38:01 algitbot: retry master 2024-05-15 14:38:09 should I disable traefik tests for s390x? 
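A sketch of that PATH idea for deno's check(): the install test creates a shim (echo_test) which in turn has to find a `deno` binary, so the freshly built one is put on PATH first (untested; the real test invocation stays as-is):

    check() {
    	# let shims created by the install tests (specs::cert::cafile_install)
    	# resolve the just-built binary
    	export PATH="$builddir/target/release:$PATH"
    	# ...existing test command unchanged...
    }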
2024-05-15 14:39:32 That would be a question better answered by Traefik's maintainer 2024-05-15 14:39:46 Maybe you could try skipping only the test(s) that fail 2024-05-15 14:40:12 let me try 2024-05-15 15:38:54 no bueno, now its failing on another test 2024-05-15 15:53:40 I retried 3 times and it worked now 2024-05-15 16:06:50 algitbot: retry master 2024-05-15 16:14:21 Hmm, so i guess netpbm finally passed tests 2024-05-15 16:38:44 algitbot: retry master 2024-05-15 17:02:20 i sent the MR to skip the one that fails on ratelimiter but the one that happens on server seems important... but its very intermintent, sometimes pass sometimes not LOL 2024-05-15 20:42:26 algitbot: retry master 2024-05-16 00:40:12 algitbot: retry master 2024-05-16 03:49:03 algitbot: hi 2024-05-16 03:50:04 25 more aports left for riscv64 community 2024-05-16 03:57:37 nice 2024-05-16 03:59:05 12 now 2024-05-16 04:03:57 7 more! 2024-05-16 04:09:16 I think i'll just temp disable ffcall if it's the last blocker for riscv64 2024-05-16 04:10:18 ffcall is only used by clisp, which is x86_64-only anyway 2024-05-16 04:12:58 Well, maybe it won't be me doing that, as the builder is now working on scryer-prolog, which took over an hour the last time it built and that was without running tests (though i don't know if that was still using emulated riscv64) 2024-05-16 04:55:01 I wonder if we'll get 3.20-rc1 or reach 66000 MRs on gitlab.a.o first 2024-05-16 05:09:17 algitbot: retry master 2024-05-16 06:06:43 build-3-20-riscv64 is finally finally finally uploading community \o/ 2024-05-16 06:22:16 nice. 2024-05-16 06:35:45 that is quite a feat, great job 2024-05-16 09:03:58 algitbot: retry master 2024-05-16 11:19:58 ncopa: uhm, surely not 3.10 2024-05-16 11:31:48 Oops 2024-05-16 11:31:59 mkinitfs? 
2024-05-16 11:32:10 -VERSION:= 3.9.1 2024-05-16 11:32:13 +VERSION:= 3.10.0_rc1 2024-05-16 11:32:20 Ah, so it's not aports 2024-05-16 11:32:41 nope 2024-05-16 11:33:06 builders are not idle anyway 2024-05-16 11:33:12 I was also wondering how aports could be tagged when the builders aren't idle 2024-05-16 11:33:14 Right 2024-05-16 12:04:33 Poor arti can't compete for resources with the 3 Chromiums+1 Deno 2024-05-16 12:05:21 ah that maybe was the reason for traefik failures yesterday 2024-05-16 12:08:51 Tasks: 893, 263 thr, 623 kthr; 80 running 2024-05-16 12:12:37 I hope it is not looking as bad as the time when it tried building 6 WebkitGTKs 2024-05-16 12:17:03 then I could not even access the host 2024-05-16 12:17:19 memory usage is low atm 2024-05-16 12:17:45 215G memory available 2024-05-16 12:18:19 last 12h minimum was 160G mem available 2024-05-16 12:19:15 Now 95% CPU User time instead of kernel 2024-05-16 12:19:37 algitbot: retry 3.19-stable 2024-05-16 12:20:05 The CPU is put to good use 2024-05-16 12:40:22 algitbot: retry master 2024-05-16 13:07:14 Thunderbird getting uploaded 2024-05-16 14:50:05 Now 4 copies of Chromium are being built by ARM 2024-05-16 16:30:02 ncopa: sorry, didn't see that it was mkinitfs 2024-05-16 16:35:21 Next target: will we get to 66050 MRs in Gitlab before 3.20-rc1 is released :) 2024-05-16 16:35:45 algitbot: retry master 2024-05-16 16:38:29 We already surpassed 65536 2024-05-16 16:38:52 Haha 2024-05-16 16:39:56 I wonder what that MR was about 2024-05-16 16:39:59 !65536 2024-05-16 16:40:08 lol 2024-05-16 16:40:17 I want send neovim 0.10 but Im looking to test and tests are failing 😅 2024-05-16 16:42:18 sad, just a closed MR 2024-05-16 16:43:35 Oh, now i looked closer 2024-05-16 16:44:00 It was probably an automatic MR creation fail 2024-05-16 16:44:33 I thought it was my MR (from the title) 2024-05-16 17:05:57 algitbot: retry master 2024-05-16 17:18:05 just fails 2024-05-16 17:18:56 just fix it 2024-05-16 17:20:29 my gitea also just fails on riscv64 timeout from npm 2024-05-16 17:57:44 just passed 2024-05-16 19:46:24 omni: no worries! 
2024-05-17 00:37:41 Next target: will we reach !66100 MRs before 3.20-rc1 :) 2024-05-17 01:35:03 algitbot: retry master 2024-05-17 07:12:53 lol 2024-05-17 07:13:05 Everytime i see "rc1" my heart skips a beat 2024-05-17 07:13:19 Then i look closer and it isn't aports 2024-05-17 07:13:32 haha 2024-05-17 10:35:53 ^ sfdisk not found 2024-05-17 10:37:44 algitbot: retry master 2024-05-17 10:39:40 +/dev/dasda is not a block device 2024-05-17 10:39:42 +/dev/dasdb is not a block device 2024-05-17 10:39:52 for s390x 2024-05-17 12:03:47 algitbot: retry master 2024-05-17 15:17:34 algitbot: retry master 2024-05-17 15:48:23 algitbot: retry master 2024-05-17 16:56:21 algitbot: retry master 2024-05-17 18:27:23 algitbot: retry master 2024-05-17 18:44:38 algitbot: retry master 2024-05-17 18:45:46 algitbot: retry master 2024-05-18 00:30:21 algitbot: retry master 2024-05-18 01:39:38 algitbot: retry master 2024-05-18 03:12:09 algitbot: retry master 2024-05-18 03:22:33 algitbot: retry master 2024-05-18 03:23:04 algitbot: retry master 2024-05-18 03:26:31 algitbot: retry master 2024-05-18 03:27:46 algitbot: retry master 2024-05-18 03:31:11 Why are manpages being converted to text :/ 2024-05-18 03:44:03 algitbot: retry master 2024-05-18 03:55:27 algitbot: retry master 2024-05-18 04:53:38 algitbot: retry master 2024-05-18 04:59:07 algitbot: retry master 2024-05-18 05:01:30 algitbot: retry master 2024-05-18 05:15:40 algitbot: retry master 2024-05-18 05:53:35 algitbot: retry master 2024-05-18 06:29:50 maybe git-lfs2 isn't needed anymore? 2024-05-18 06:34:39 Maybe you'll have to ask on Gitlab 2024-05-18 06:41:16 !66120 2024-05-18 07:25:23 algitbot: retry master 2024-05-18 09:36:18 algitbot: retry master 2024-05-18 09:43:52 algitbot: retry master 2024-05-18 09:59:58 I'm holding off merging !66120 to give jirutka a chance to have a say, but bratkartoffel, who originally reported the issue with git-lfs v3, doesn't need it anymore so it would probably be fine to merge if needed to get things moving 2024-05-18 11:33:19 algitbot: retry master 2024-05-18 13:17:42 algitbot: retry master 2024-05-18 14:17:10 algitbot: retry master 2024-05-18 14:28:23 algitbot: retry master 2024-05-18 14:40:06 Seems like it fails due to upgraded git? 
2024-05-18 14:40:21 fatal: active `post-checkout` hook found during `git clone` For security reasons, this is disallowed by default 2024-05-18 14:43:20 Probably 2024-05-18 14:43:29 There's already an MR to remove git-lfs2 2024-05-18 14:45:05 !66120 2024-05-18 16:52:33 algitbot: retry master 2024-05-18 17:05:34 ^ note to self: check out what's wrong later on, but for now i don't want to block the 3.20 builders (other archs didn't encounter this problem with tests) 2024-05-18 17:10:58 algitbot: retry master 2024-05-18 17:12:06 algitbot: retry master 2024-05-18 17:17:50 algitbot: retry master 2024-05-18 17:19:06 algitbot: retry master 2024-05-18 17:20:27 I think apparmor needs a pkgrel bump (i was surprised that i retried master and the builders immediately went back to community) 2024-05-18 17:20:52 yes, it would 2024-05-18 17:21:39 I'm working on it 2024-05-18 17:24:38 algitbot: retry master 2024-05-18 17:27:57 algitbot: retry master 2024-05-18 17:28:26 build-3-20-aarch64 is back to building py3-virtualenv 2024-05-18 17:28:33 Oh, and it passed this time 2024-05-18 17:29:33 algitbot: retry master 2024-05-18 17:42:56 algitbot: retry master 2024-05-18 17:55:07 algitbot: retry master 2024-05-18 18:00:29 algitbot: retry master 2024-05-18 18:10:27 algitbot: retry master 2024-05-18 18:54:12 w00t 2024-05-18 20:53:27 ^ checksum missing 2024-05-18 21:13:22 sigh 2024-05-18 21:13:55 right 2024-05-18 23:29:23 algitbot: retry master 2024-05-19 00:46:19 algitbot: retry master 2024-05-19 02:34:44 algitbot: retry master 2024-05-19 02:56:33 9 more aports left for x86_64 testing 2024-05-19 02:56:54 (mostly from the Go rebuild) 2024-05-19 03:21:09 Finally uploaded :) 2024-05-19 11:31:59 OOM victim 2024-05-19 11:38:15 Out of memory: Killed process 31993 (cc1plus) total-vm:3210140kB, anon-rss:2967356kB, file-rss:0kB, shmem-rss:0kB, UID:1000 pgtables:6224kB oom_score_adj:0 2024-05-19 11:39:16 algitbot: retry master 2024-05-19 11:39:23 now that build-edge-aarch64 is done with it 2024-05-19 11:58:44 *sigh* 2024-05-19 11:59:03 with build-edge-x86_64 is done with it... 2024-05-19 13:01:04 algitbot: retry master 2024-05-20 05:22:08 Oops, SQLite tests are now running on 16 archs at the same time 2024-05-20 05:22:10 ACTION keeps fingers crossed 2024-05-20 05:26:40 :O 2024-05-20 05:26:43 s390x is the fastest? 2024-05-20 05:27:38 does the sqlite test suite run all tests on all arches? did it just skip a bunch of tests? 
2024-05-20 05:27:57 (I could look, but I'm lazy) 2024-05-20 05:28:56 Let me see, it does print a summary of the number of tests ran 2024-05-20 05:29:18 s390x: "0 errors out of 377329 tests" 2024-05-20 05:30:52 Same number on x86_64 2024-05-20 05:30:59 Oh wait 2024-05-20 05:31:05 It's 377339 for x86_64 2024-05-20 05:32:52 So, 10 tests less than most of the other archs 2024-05-20 05:35:48 (Tests are parallelized for riscv64, i wonder if that made it slower this time (as 2 copies of the test suite are running at the same time)) 2024-05-20 05:36:52 I hope it is not stuck on build-edge-riscv64 2024-05-20 05:37:13 Oh, thankfully 2024-05-20 05:39:28 So, criteria for moving to community (running the test suite), check 2024-05-20 05:40:42 Whether the tools can be built as part of main/sqlite is probably something for after 3.20, as it would very likely involve changing source tarball and resolving a circular dep between it and tcl, so not something that can be rushed with 3.20 just around the corner 2024-05-20 05:44:03 Anyway, i'm happy that there is at least some tests that can be run upon SQLite upgrades now, as i've often upgraded SQLite wondering if the upgrade will break something 2024-05-20 08:34:25 Wow, 4 languages having commits one after another 2024-05-20 08:34:55 Anyone want to add a 5th to the list? 2024-05-20 08:56:32 Hopefully the builders are able to finish building NodeJS 2024-05-20 09:04:03 Oh no :( 2024-05-20 09:04:38 2 NodeJS + 1 QtWebEngine proved too much.. 2024-05-20 09:11:00 it can be retried later 2024-05-20 09:13:26 armv7 is the first this time around 2024-05-20 09:14:36 There's your retry 2024-05-20 11:56:39 algitbot: retry master 2024-05-20 14:22:35 algitbot: retry master 2024-05-20 15:09:21 Spent some time chasing down the commit that caused a test failure 2024-05-20 15:09:56 So, paradoxically, was really happy when the failure occurred :) 2024-05-20 15:43:09 algitbot: retry master 2024-05-20 16:03:23 algitbot: retry master 2024-05-20 16:13:34 cppcheck has finally passed 2024-05-20 16:13:43 deno has also been uploaded on 3.20 x86_64, now only edge remains 2024-05-20 19:05:58 rip util-linux 2024-05-20 19:06:46 algitbot: retry master 2024-05-20 19:20:04 seems to be flaky 2024-05-20 19:20:05 algitbot: retry master 2024-05-20 20:54:40 algitbot: retry 3.18-stable 2024-05-20 21:27:41 algitbot: retry 3.17-stable 2024-05-21 02:04:06 Ugh 2024-05-21 02:04:25 So, something changed in the 3 days since !66090 was opened 2024-05-21 02:08:29 util-linux gained a new dmesg subpackage, and i took over maintainership of sqlite 2024-05-21 02:09:22 I'm betting on the former 2024-05-21 02:14:35 No, that's not it 2024-05-21 02:15:29 So much for wanting to close a long standing MR that's been assigned to me for review (!54727) 2024-05-21 02:20:21 ACTION remembers back to the day before 3.19 when NodeJS errors started appearing 2024-05-21 02:20:33 Seems like this time around, i've hit the jackpot 2024-05-21 02:27:20 Bingo 2024-05-21 02:27:29 The Busybox CVE fix commit is causing this problem 2024-05-21 02:28:56 Downgrading to busybox=1.36.1-r26 allows the lttng-ust tests to pass 2024-05-21 03:00:01 Phew 2024-05-21 06:53:02 OOM killer strikes again? 2024-05-21 06:55:53 is it possible to try them one at a time? 
2024-05-21 06:56:35 Probably not 2024-05-21 06:57:10 but allowing armv7 and armhf to complete first should be of some help 2024-05-21 06:58:11 (all 3 archs run on the same host machine, so edge+3.20 makes that 6 copies being built at once) 2024-05-21 07:01:39 32-bit ARMs are done with Clang, so should be safe to retry aarch64 now 2024-05-21 07:01:42 algitbot: retry master 2024-05-21 07:18:39 Finally done with Clang 18 :) 2024-05-21 16:33:04 algitbot: retry master 2024-05-22 06:00:11 algitbot: retry 3.17-stable 2024-05-22 06:00:55 algitbot: retry 3.16-stable 2024-05-22 08:58:19 o 2024-05-22 09:47:59 \o/ 2024-05-22 16:30:22 algitbot: retry master 2024-05-22 16:37:07 algitbot: retry master 2024-05-22 16:37:41 algitbot: retry master 2024-05-22 16:53:49 algitbot: retry 3.20-stable 2024-05-22 21:08:58 wew. Now that 3.20 is out, I'd like to see about tackling updating the LXQt packages... 2024-05-22 21:10:35 Biggest hurdle right now is libdbusmenu-lxqt, which would be a new package... It's a qt6 fork of libdbusmenu-qt. I'm wondering about the possibility of sending it straight to community and bypassing testing. 2024-05-22 21:11:26 https://git.alpinelinux.org/aports/tree/community/libdbusmenu-qt/APKBUILD 2024-05-22 21:11:28 https://gitlab.alpinelinux.org/alpine/aports/-/merge_requests/65828/diffs?commit_id=62866fac6edb4a1dbb7e07fd648baa31555db2ac 2024-05-22 21:13:51 it's a dependency for the 2.x/qt6 versions of LXQt stage IV packages (i.e. these: https://github.com/lxqt/lxqt/wiki/Building-from-source#iv---all-remaining-packages-in-any-order ) 2024-05-22 21:20:30 crap I meant to post all that in -devel. .... #fail 2024-05-23 02:24:38 algitbot: retry master 2024-05-23 07:55:58 algitbot: retry master 2024-05-23 07:56:16 algitbot: retry 3.20-stable 2024-05-23 07:57:04 PureTryOut: it looks like they're hitting a broken mirror? 2024-05-23 07:57:08 Why... Does it fail to retrieve sources on x86 architectures only? 2024-05-23 07:57:18 Seems like it, but only on x86_64 and x86 🤔 2024-05-23 07:57:26 probably location-based 2024-05-23 07:57:36 Ah yeah that sounds right 2024-05-23 09:43:58 algitbot: retry 3.20-stable 2024-05-23 10:59:13 okay um 2024-05-23 10:59:58 nobody saw anything 2024-05-23 11:04:06 👀 2024-05-23 11:04:40 first try 2024-05-23 11:53:05 algitbot: retry master 2024-05-23 13:42:09 algitbot: retry 3.20-stable 2024-05-23 14:13:42 algitbot: retry master 2024-05-23 16:22:22 Trusting the test suite :) 2024-05-23 19:08:42 algitbot: retry master 2024-05-23 19:40:56 algitbot: retry master 2024-05-23 20:33:57 algitbot: retry master 2024-05-23 20:34:12 * algitbot: retry master 2024-05-23 21:00:13 I think riscv64 is stuck, it's been working on community/xdg-desktop-portal-kde for way too long now... 2024-05-23 21:01:37 algitbot: retry master 2024-05-23 21:36:08 PureTryOut: there are network issues 2024-05-23 21:40:00 I wonder if that contributed to my job#1396450 failing 2024-05-24 07:13:39 algitbot: retry master 2024-05-24 07:24:18 algitbot: retry master 2024-05-24 10:46:59 algitbot: retry master 2024-05-24 12:39:22 algitbot: retry master 2024-05-24 12:41:13 algitbot: retry master 2024-05-24 12:48:06 algitbot: retry master 2024-05-24 12:58:55 algitbot: retry master 2024-05-24 13:12:45 algitbot: retry master 2024-05-24 13:14:10 algitbot: retry master 2024-05-24 15:17:00 algitbot: retry master 2024-05-24 15:17:10 algitbot: retry 3.20-stable 2024-05-24 16:36:14 algitbot: retry master 2024-05-24 16:36:59 algitbot: retry master 2024-05-24 16:37:17 Ugh 2024-05-24 16:37:23 Why won't it build something else.. 
2024-05-24 16:37:25 algitbot: retry master 2024-05-24 16:38:06 (i just saw !66491 being closed, so not sure if s390x works for those packages or not) 2024-05-24 16:38:33 algitbot: retry master 2024-05-24 16:39:34 So x86_64 finally finished building Chromium! 2024-05-24 16:39:37 cely, I made an MR to fix this: 66498 2024-05-24 16:39:39 *!66498 2024-05-24 16:39:54 the title is a bit misleading because it largely disables s390x 2024-05-24 16:40:05 Welcome back from the Chromium abyss, build-edge-x86_64 2024-05-24 16:40:21 cely, aaah I see, so now it finally can catch up 2024-05-24 16:58:48 I'd be very happy if it could start skipping this one for now and just build its dependencies 2024-05-24 16:59:42 It should build dependencies first, if it does not, it means there is some issue, either a circular dependency, or a hidden dependency 2024-05-24 17:00:09 I think I got it wrong 2024-05-24 17:00:42 What dependencies are you talking about? 2024-05-24 17:00:43 we need to take it out for now 2024-05-24 17:00:48 The dependencies are built 2024-05-24 17:00:52 Just not uploaded 2024-05-24 17:01:28 Each repository, main community testing, has to be completely unblocked for whatever's been built in there to be uploaded 2024-05-24 17:01:33 the reason is that the fix for this issue is in here: https://gitlab.alpinelinux.org/alpine/aports/-/merge_requests/66498/diffs?commit_id=93ea6707f04c3ee7e523fd33b73a3447f3209c9f#cd14eea7b21e9c0621c58a110a26238e0a9e6945_7_6 2024-05-24 17:01:37 but that does not matter for the builder 2024-05-24 17:01:40 the builder has all packages locally 2024-05-24 17:02:30 and for that package's pipeliens to succeed we need this MR first: !65923 2024-05-24 17:02:52 ^ fluix: i hope you don't mind that i just got that commit in (trying to beat the Gitlab shutdown) 2024-05-24 17:02:54 and that one's pipelines fail because py3-fastapi is not built for s390x yet 2024-05-24 17:05:17 we need to take grommunio-admin-api out, it seems to be stuck on trying to build it 2024-05-24 17:05:43 (at least I think so) 2024-05-24 17:05:53 cely: totally fine, thanks for doing so much :D 2024-05-24 17:06:15 Thermi: you don't need to necessary remove it to fix it, just push a commit that fixes it and the builder should pick that up the next time 2024-05-24 17:06:22 I see 2024-05-24 17:06:44 As long as dependencies are marked correctly, the builder first builds the dependency before building the package itself 2024-05-24 17:07:28 But CI pipelines like cely mentioned don't see the already built but not uploaded packages yet 2024-05-24 17:09:08 If you want to know if the pipeline will succeed or not, you can always bump pkgrel on those aports and add them to the MR, then remove them after you've verified that CI is green 2024-05-24 17:09:29 indeed 2024-05-24 17:09:50 Any change the the APKBUILD would trigger a rebuild btw 2024-05-24 17:09:54 even a whitespace change 2024-05-24 17:10:06 (in CI) 2024-05-24 17:10:23 But adding a trailing whitespace would make lint unhappy ;) 2024-05-24 17:10:41 Could be a reminder to remove that before merging though 2024-05-25 01:04:34 algitbot: retry master 2024-05-25 01:05:05 algitbot: retry master 2024-05-25 01:29:06 algitbot: retry master 2024-05-25 07:14:00 ERROR: plasma-workspace-wallpapers-6.0.5-r0.apk: UNTRUSTED signature 2024-05-25 07:14:02 >>> ERROR: kinfocenter: Failed to create index 2024-05-25 07:14:04 huh 2024-05-25 07:14:30 Maybe r0 was used before? 
2024-05-25 07:15:43 this is when adding a package to the index 2024-05-25 07:15:58 and the builders do not use dl-cdn 2024-05-25 07:16:05 Ok 2024-05-25 07:17:35 Don't see anything strange here: https://build.alpinelinux.org/buildlogs/build-3-20-riscv64/community/plasma-workspace-wallpapers/plasma-workspace-wallpapers-6.0.4-r0.log 2024-05-25 07:18:13 It's complaining about 6.0.5 though 2024-05-25 07:18:18 Yes, just noticed 2024-05-25 07:19:09 ok, a bunch of null bytes in the buildlog for 6.0.5, so I guess something crashed when that was built 2024-05-25 07:19:22 I'll delete the package so that it gets built again 2024-05-25 07:19:58 algitbot: retry 3.20-stable 2024-05-25 13:54:22 algitbot: retry master 2024-05-25 14:23:14 algitbot: retry master 2024-05-25 15:54:45 algitbot: retry master 2024-05-25 16:02:16 algitbot: retry master 2024-05-25 16:14:12 algitbot: retry master 2024-05-25 16:27:20 algitbot: retry master 2024-05-25 16:45:54 algitbot: retry master 2024-05-25 17:05:10 algitbot: retry master 2024-05-25 17:24:17 algitbot: retry master 2024-05-25 17:54:46 ... 2024-05-25 17:54:51 haha 2024-05-25 17:57:43 algitbot: retry master 2024-05-25 18:10:47 algitbot: retry master 2024-05-25 18:20:04 algitbot: retry master 2024-05-25 20:59:13 Failed to start message bus: Error in file /etc/dbus-1/session.conf, line 1, column 0: no element found 2024-05-26 00:46:36 algitbot: retry master 2024-05-26 00:54:35 algitbot: retry master 2024-05-26 03:13:27 algitbot: retry master 2024-05-27 01:51:50 ... 2024-05-27 15:17:18 Hmm, i only saw messages for v3.20/main/aarch64 and armhf being uploaded...is my internet connection playing tricks on me again 2024-05-27 15:25:02 cely: I see messages for all arches 2024-05-27 15:26:35 Now i see it 2024-05-27 15:26:56 Silly internet connection :( 2024-05-28 11:03:38 PureTryOut: ^ 2024-05-28 11:03:58 but it succeeded on CI 😢 2024-05-28 11:04:36 algitbot: retry master 2024-05-28 11:06:19 yeah, it's quite odd tbh 2024-05-28 11:06:24 maybe just a one-off error 2024-05-28 11:06:41 race condition? 2024-05-28 11:11:55 indeed a one-off issue 🙃 2024-05-30 09:17:34 algitbot: retry master 2024-05-30 17:30:44 algitbot: retry master 2024-05-30 17:41:17 algitbot: retry master 2024-05-30 19:36:18 algitbot: retry master 2024-05-31 10:50:19 ^ is also a rename of the aport 2024-05-31 10:52:30 right, though, only in testing 2024-05-31 10:52:35 Hmm, but the reason ("required a rename of the pkg directory") makes me think setting `builddir` could have been used instead
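On that last point, the builddir route keeps the aport directory name and just tells abuild where the tarball actually extracts to, e.g. (names made up):

    pkgname=newname
    _oldname=oldname
    pkgver=1.2.3
    source="https://example.org/$_oldname-$pkgver.tar.gz"
    # tarball still extracts to the old name, so no directory rename is needed
    builddir="$srcdir/$_oldname-$pkgver"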