build(nix): refactor nix stuff completely #1096

Open
Aviac wants to merge 6 commits from Aviac/continuwuity:flake-parts-isation into main
Member

We keep the old code in the nix subdirectory for good measure.

Currently, this allows users to build:

  • continuwuity with default features
  • continuwuity with all (sensible) features enabled

This doesn't include the cross-compilation builds, so:

  • NO musl builds (yet)
  • NO aarch builds (yet)

They didn't work beforehand anyway, so 🤷. This also gave me the chance to simplify a lot of the code massively. I'm sure we can add cross compilation back once there is a bit more demand. For now, this should give us a good new base to work with.

Verification:

The flake includes tests for both variants of the build, as well as Nix-native checks for a bunch of things like rustfmt, taplo fmt, nextest, doctest, audits, licenses ...

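For orientation, a minimal sketch of the flake output shape described above; the attribute names (`all-features`, the check names, the `allFeatures` knob) are assumptions for illustration, not necessarily the real ones.

```nix
# Hypothetical excerpt of the flake's flake-parts perSystem module; names are illustrative only.
{
  perSystem =
    { self', ... }:
    {
      packages = {
        default = self'.packages.continuwuity; # default feature set
        # "all (sensible) features" variant; the override knob is an assumption
        all-features = self'.packages.continuwuity.override { allFeatures = true; };
      };
      checks = {
        # the package builds double as checks, next to rustfmt/taplo/nextest/doctest/audit checks
        build-default = self'.packages.default;
        build-all-features = self'.packages.all-features;
      };
    };
}
```
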
remove unnecessary arg

also create module for build (unused)

move continuwuity code and use new uwulib.build code

exactly replicate environment code as in legacy nix code
simplify
All checks were successful
Documentation / Build and Deploy Documentation (pull_request) Has been skipped
Checks / Prek / Pre-commit & Formatting (pull_request) Successful in 1m22s
Release Docker Image / Build linux-amd64 (release) (pull_request) Successful in 6m20s
Release Docker Image / Build linux-arm64 (release) (pull_request) Successful in 6m29s
Checks / Prek / Clippy and Cargo Tests (pull_request) Successful in 7m40s
Release Docker Image / Create Multi-arch Release Manifest (pull_request) Successful in 4s
Release Docker Image / Build linux-amd64 (max-perf) (pull_request) Successful in 12m29s
Release Docker Image / Build linux-arm64 (max-perf) (pull_request) Successful in 12m26s
Release Docker Image / Create Max-Perf Manifest (pull_request) Successful in 4s
ded5b964ba
nex added this to the 0.5.0 milestone 2025-10-02 14:50:12 +00:00
Contributor

I think with the move of the hashes it's now a good idea to look into running https://github.com/Mic92/nix-update/ in CI as a replacement for my workflow.

Owner

Is there a particular reason to keep the old code around? It'll still be in the git history

Author
Member

Is there a particular reason to keep the old code around? It'll still be in the git history

Yeah ok, makes sense. I just used it while developing to mirror the code for convenience. I'll remove it 👍

remove legacy code 😢👋
All checks were successful
Documentation / Build and Deploy Documentation (pull_request) Has been skipped
Checks / Prek / Pre-commit & Formatting (pull_request) Successful in 1m49s
Release Docker Image / Build linux-amd64 (release) (pull_request) Successful in 6m32s
Release Docker Image / Build linux-arm64 (release) (pull_request) Successful in 6m42s
Checks / Prek / Clippy and Cargo Tests (pull_request) Successful in 8m12s
Release Docker Image / Create Multi-arch Release Manifest (pull_request) Successful in 4s
Release Docker Image / Build linux-arm64 (max-perf) (pull_request) Successful in 12m54s
Release Docker Image / Build linux-amd64 (max-perf) (pull_request) Successful in 13m1s
Release Docker Image / Create Max-Perf Manifest (pull_request) Successful in 6s
8d6f11052c
Author
Member

@Shuroii I tried the following:

nix run nixpkgs#nix-update -- rocksdb --flake --version v10.5.fb

and got the following diff

diff --git a/nix/packages/rocksdb.nix b/nix/packages/rocksdb.nix
index feb0806d..103aa79b 100644
--- a/nix/packages/rocksdb.nix
+++ b/nix/packages/rocksdb.nix
@@ -21,9 +21,9 @@
                 owner = "continuwuation";
                 repo = "rocksdb";
                 rev = "10.5.fb";
-                sha256 = "sha256-X4ApGLkHF9ceBtBg77dimEpu720I79ffLoyPa8JMHaU=";
+                sha256 = "path: /nix/store/v50ynrqg43z4jpi6ld7hhhv0fc42aixv-source";
               };
-              version = "v10.5.fb";
+              version = "10.5.fb";
               cmakeFlags =
                 lib.subtractLists [
                   # No real reason to have snappy or zlib, no one uses this

I'm very confused ^^'

The path holds the rocksdb source code.

savyajha requested changes 2025-10-03 09:18:22 +00:00
Dismissed
savyajha left a comment
Contributor

A few minor nits and questions (I'm not familiar with crane, so cannot comment there), but more around rocksdb and liburing.

@ -0,0 +10,4 @@
# own. In order for this to work, we need to set flags on the build that match
# whatever flags tikv-jemalloc-sys was going to use. These are dependent on
# which features we enable in tikv-jemalloc-sys.
packages.rust-jemalloc-sys' =
Contributor

You can just use packages.rust-jemalloc-sys-unprefixed from upstream nixpkgs, which should work properly. If using the upstream package, might not be worth it to disable c++ or docs.

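For reference, the suggested swap would roughly look like this; a sketch only, assuming the module shape from the snippet above:

```nix
{ pkgs, ... }:
{
  # Reuse nixpkgs' unprefixed jemalloc build instead of redefining it here;
  # it is already configured to match what tikv-jemalloc-sys expects.
  packages.rust-jemalloc-sys' = pkgs.rust-jemalloc-sys-unprefixed;
}
```
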
Author
Member

My first try at naively replacing the package ran into this:

error: builder for '/nix/store/xmabk06d963m6kwvfkc0f16w7nfxzyn0-conduwuit-0.5.0-rc.8.drv' failed with exit code 101;
       last 25 log lines:
       >    Compiling conduwuit v0.5.0-rc.8 (/build/source/src/main)
       >     Finished `release` profile [optimized] target(s) in 1m 24s
       > searching for bins/libs to install from cargo build log at cargoBuildLogo3xy.json
       > installing /build/source/target/release/conduwuit in postBuildInstallFromCargoBuildLogOutTempXK7/bin
       > searching for bins/libs complete
       > buildPhase completed in 1 minutes 25 seconds
       > Running phase: checkPhase
       > +++ command cargo test --release --no-default-features --locked --features bindgen-runtime,blurhashing,brotli_compression,console,direct_tls,element_hacks,full,gzip_compression,io_uring,jemalloc,jemalloc_conf
,journald,ldap,media_thumbnail,perf_measurements,release_max_log_level,sentry_telemetry,standard,systemd,tokio_console,url_preview,zstd_compression
       >    Compiling conduwuit_core v0.5.0-rc.8 (/build/source/src/core)
       >    Compiling conduwuit_build_metadata v0.5.0-rc.8 (/build/source/src/build_metadata)
       >    Compiling conduwuit_macros v0.5.0-rc.8 (/build/source/src/macros)
       >    Compiling conduwuit_database v0.5.0-rc.8 (/build/source/src/database)
       >    Compiling conduwuit_service v0.5.0-rc.8 (/build/source/src/service)
       >    Compiling conduwuit_api v0.5.0-rc.8 (/build/source/src/api)
       >    Compiling conduwuit_web v0.5.0-rc.8 (/build/source/src/web)
       >    Compiling conduwuit_admin v0.5.0-rc.8 (/build/source/src/admin)
       >    Compiling conduwuit_router v0.5.0-rc.8 (/build/source/src/router)
       >    Compiling conduwuit v0.5.0-rc.8 (/build/source/src/main)
       >     Finished `release` profile [optimized] target(s) in 1m 09s
       >      Running unittests mod.rs (target/release/deps/conduwuit-67fbd204f38e8c35)
       > <jemalloc>: Invalid conf pair: prof_active:false
       > error: test failed, to rerun pass `-p conduwuit --lib`
       >
       > Caused by:
       >   process didn't exit successfully: `/build/source/target/release/deps/conduwuit-67fbd204f38e8c35` (signal: 11, SIGSEGV: invalid memory reference)
       For full logs, run:
              nix log /nix/store/xmabk06d963m6kwvfkc0f16w7nfxzyn0-conduwuit-0.5.0-rc.8.drv

I'll try further

Contributor

This usually happens when multiple versions of jemalloc are being used. Can you also check what ldd <final binary without checkPhase> gives you?

Author
Member
linux-vdso.so.1 (0x00007f8f9d240000)
librocksdb.so.10 => /nix/store/8w38ib6qrk9icvsbii0ialhpcx18jph8-rocksdb-10.5.fb/lib/librocksdb.so.10 (0x00007f8f97800000)
libgcc_s.so.1 => /nix/store/41ym1jm1b7j3rhglk82gwg9jml26z1km-gcc-14.3.0-lib/lib/libgcc_s.so.1 (0x00007f8f9d20a000)
libm.so.6 => /nix/store/776irwlgfb65a782cxmyk61pck460fs9-glibc-2.40-66/lib/libm.so.6 (0x00007f8f97718000)
libc.so.6 => /nix/store/776irwlgfb65a782cxmyk61pck460fs9-glibc-2.40-66/lib/libc.so.6 (0x00007f8f97400000)
/nix/store/776irwlgfb65a782cxmyk61pck460fs9-glibc-2.40-66/lib/ld-linux-x86-64.so.2 => /nix/store/776irwlgfb65a782cxmyk61pck460fs9-glibc-2.40-66/lib64/ld-linux-x86-64.so.2 (0x00007f8f9d242000)
libjemalloc.so.2 => /nix/store/4r5h0br31v0wi0ybimd2vcfrljnrq7vn-jemalloc-5.3.0/lib/libjemalloc.so.2 (0x00007f8f97245000)
libbz2.so.1 => /nix/store/x1nn5jrx1im7wmmn1xmpjvyf072s5k75-bzip2-1.0.8/lib/libbz2.so.1 (0x00007f8f9d1f4000)
liblz4.so.1 => /nix/store/dww7q4wmgj9x62fnriarqarb3xcaygj3-lz4-1.10.0-lib/lib/liblz4.so.1 (0x00007f8f9d1b4000)
libzstd.so.1 => /nix/store/bxn7mqdrckhjkmvy5ynhybmj0lmdlgb4-zstd-1.5.7/lib/libzstd.so.1 (0x00007f8f9763f000)
liburing.so.2 => /nix/store/yspagjxal0aviqg0wy91kfm7a3kyf38s-liburing-2.12/lib/liburing.so.2 (0x00007f8f9d1ac000)
libstdc++.so.6 => /nix/store/41ym1jm1b7j3rhglk82gwg9jml26z1km-gcc-14.3.0-lib/lib/libstdc++.so.6 (0x00007f8f96e00000)
libpthread.so.0 => /nix/store/776irwlgfb65a782cxmyk61pck460fs9-glibc-2.40-66/lib/libpthread.so.0 (0x00007f8f9d1a5000)
Author
Member

This comes from the jemalloc_prof feature flag being enabled somehow even though it's in the disabled features list ... I'll look into it

Contributor

Ideally the output should look like so:

linux-vdso.so.1 (0x00007f38bd381000)
librocksdb.so.10 => /nix/store/sj2gz9lpprrn7qli9am8801xxkim3kz4-rocksdb-10.5.1/lib/librocksdb.so.10 (0x00007f38b80ce000)
libzstd.so.1 => /nix/store/ahw6fr4wihcyq1rbx9ivi4qv6zmn10gg-zstd-1.5.7/lib/libzstd.so.1 (0x00007f38b7ff8000)
libjemalloc.so.2 => /nix/store/4r5h0br31v0wi0ybimd2vcfrljnrq7vn-jemalloc-5.3.0/lib/libjemalloc.so.2 (0x00007f38b7e3d000)
libgcc_s.so.1 => /nix/store/41ym1jm1b7j3rhglk82gwg9jml26z1km-gcc-14.3.0-lib/lib/libgcc_s.so.1 (0x00007f38b7e0f000)
libm.so.6 => /nix/store/776irwlgfb65a782cxmyk61pck460fs9-glibc-2.40-66/lib/libm.so.6 (0x00007f38b7d25000)
libc.so.6 => /nix/store/776irwlgfb65a782cxmyk61pck460fs9-glibc-2.40-66/lib/libc.so.6 (0x00007f38b7b1c000)
/nix/store/776irwlgfb65a782cxmyk61pck460fs9-glibc-2.40-66/lib/ld-linux-x86-64.so.2 => /nix/store/g8zyryr9cr6540xsyg4avqkwgxpnwj2a-glibc-2.40-66/lib64/ld-linux-x86-64.so.2 (0x00007f38bd383000)
libbz2.so.1 => /nix/store/x1nn5jrx1im7wmmn1xmpjvyf072s5k75-bzip2-1.0.8/lib/libbz2.so.1 (0x00007f38b7b08000)
liblz4.so.1 => /nix/store/dww7q4wmgj9x62fnriarqarb3xcaygj3-lz4-1.10.0-lib/lib/liblz4.so.1 (0x00007f38b7ac8000)
liburing.so.2 => /nix/store/yspagjxal0aviqg0wy91kfm7a3kyf38s-liburing-2.12/lib/liburing.so.2 (0x00007f38b7abe000)
libstdc++.so.6 => /nix/store/41ym1jm1b7j3rhglk82gwg9jml26z1km-gcc-14.3.0-lib/lib/libstdc++.so.6 (0x00007f38b784c000)
libpthread.so.0 => /nix/store/776irwlgfb65a782cxmyk61pck460fs9-glibc-2.40-66/lib/libpthread.so.0 (0x00007f38b7847000)

libjemalloc should be fairly high up, ideally above libc. We suffered from a similar problem while upgrading from rc6 -> rc7 in nixpkgs.

Author
Member

Ok, found it ... jemalloc_prof gets enabled by the full feature set, which was enabled by accident!

Author
Member

Just for the record: jemalloc_prof enabled the config option pair mentioned in this line

       > <jemalloc>: Invalid conf pair: prof_active:false
Author
Member

Ugh, the warning actually went away, but it was unrelated, as you thought :(

searching for bins/libs to install from cargo build log at cargoBuildLogLBCI.json
installing /build/source/target/release/conduwuit in postBuildInstallFromCargoBuildLogOutTempjLi/bin
searching for bins/libs complete
buildPhase completed in 1 minutes 28 seconds
Running phase: checkPhase
@nix { "action": "setPhase", "phase": "checkPhase" }
+++ command cargo test --release --no-default-features --locked --features bindgen-runtime,blurhashing,brotli_compression,console,direct_tls,element_hacks,gzip_compression,io_uring,jemalloc,jemalloc_conf,journald,lda>
   Compiling conduwuit_core v0.5.0-rc.8 (/build/source/src/core)
   Compiling conduwuit_build_metadata v0.5.0-rc.8 (/build/source/src/build_metadata)
   Compiling conduwuit_macros v0.5.0-rc.8 (/build/source/src/macros)
   Compiling conduwuit_database v0.5.0-rc.8 (/build/source/src/database)
   Compiling conduwuit_service v0.5.0-rc.8 (/build/source/src/service)
   Compiling conduwuit_api v0.5.0-rc.8 (/build/source/src/api)
   Compiling conduwuit_web v0.5.0-rc.8 (/build/source/src/web)
   Compiling conduwuit_admin v0.5.0-rc.8 (/build/source/src/admin)
   Compiling conduwuit_router v0.5.0-rc.8 (/build/source/src/router)
   Compiling conduwuit v0.5.0-rc.8 (/build/source/src/main)
    Finished `release` profile [optimized] target(s) in 1m 10s
     Running unittests mod.rs (target/release/deps/conduwuit-2f0df0f69403d892)
error: test failed, to rerun pass `-p conduwuit --lib`

Caused by:
  process didn't exit successfully: `/build/source/target/release/deps/conduwuit-2f0df0f69403d892` (signal: 11, SIGSEGV: invalid memory reference)
Author
Member

Could this be related?

conduwuit> Running phase: updateAutotoolsGnuConfigScriptsPhase
conduwuit> Updating Autotools / GNU config script to a newer upstream version: ./target/release/build/tikv-jemalloc-sys-8b6728363723bb21/out/build/build-aux/config.sub
conduwuit> Updating Autotools / GNU config script to a newer upstream version: ./target/release/build/tikv-jemalloc-sys-8b6728363723bb21/out/build/build-aux/config.guess
Contributor

It might, it might not. The bottom line is that the linking is not happening properly. Was the package building before your changes? I'd recommend removing the customisations you have made to jemalloc and liburing and building after that.

Author
Member

Idk, I can't get it to work and I don't want to spend more of my Sunday on it for now. It seems like the cxx upstream feature just creates some kind of trouble ... otherwise it wouldn't succeed without it. I tried a lot of variants now, but I'm getting tired of waiting 2 min for the compilation :p

I'll come back once I get some motivation again. Maybe we can still merge this in the meantime and fix it some other day?

Author
Member

Was the package building before your changes?

Yes, with no problem. It also passes the NixOS test

Author
Member

My personal guess is that in the unprefixed version there is a clash between the C++ and Rust symbols. I don't know how it works upstream then, though.

Aviac marked this conversation as resolved
@ -0,0 +6,4 @@
...
}:
{
packages.liburing = pkgs.liburing.overrideAttrs (prev: {
Contributor

Why not just use upstream to reduce build time?

Author
Member

It's honestly not the part that takes the longest in the build, so I didn't really care yet ^^ I think this takes like sub-10 seconds.

Aviac marked this conversation as resolved
@ -0,0 +9,4 @@
{
packages = {
rocksdbBase =
(pkgs.rocksdb_9_10.override {
Contributor

pkgs.rocksdb.override will also work.

Aviac marked this conversation as resolved
@ -0,0 +13,4 @@
# Override the liburing input for the build with our own so
# we have it built with the library flag
liburing = self'.packages.liburing;
jemalloc = self'.packages.rust-jemalloc-sys';
Contributor

jemalloc = pkgs.rust-jemalloc-sys-unprefixed if you're okay using upstream.

Aviac marked this conversation as resolved
@ -0,0 +27,4 @@
cmakeFlags =
lib.subtractLists [
# No real reason to have snappy or zlib, no one uses this
"-DWITH_SNAPPY=1"
Contributor

Ideally use (lib.cmakeBool "WITH_SNAPPY" true) instead of -DWITH_SNAPPY=1, imo. That's nixpkgs best practices.

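For illustration, the same subtraction written with the helper; a sketch assuming `lib` and `prev` (the overridden attrs) are in scope, and noting that `subtractLists` only removes entries that match the base flags exactly:

```nix
# (lib.cmakeBool "WITH_SNAPPY" true) renders as "-DWITH_SNAPPY:BOOL=TRUE";
# the strings listed here must match how the base derivation spells its flags.
cmakeFlags =
  lib.subtractLists [
    # No real reason to have snappy or zlib, no one uses this
    (lib.cmakeBool "WITH_SNAPPY" true)
    (lib.cmakeBool "WITH_ZLIB" true)
  ] prev.cmakeFlags;
```
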
Aviac marked this conversation as resolved
@ -0,0 +65,4 @@
# Unsetting this so we don't have to revert it and make this nix exclusive
patches = [ ];
postPatch = ''
Contributor

We don't need these, the upstream package handles these well enough.

Author
Member

Unfortunately it was broken when I tried to remove them. I will try again though with the rest of your suggestions!

Author
Member
error: builder for '/nix/store/02qnvj931hyka10g8n3p9v137isw2ims-rocksdb-v10.5.fb.drv' failed with exit code 2;
       last 5 log lines:
       > Running phase: unpackPhase
       > unpacking source archive /nix/store/v50ynrqg43z4jpi6ld7hhhv0fc42aixv-source
       > source root is source
       > Running phase: patchPhase
       > sed: can't read third-party/folly/folly/synchronization/detail/ProxyLockable-inl.h: No such file or directory
       For full logs, run:
           nix log /nix/store/02qnvj931hyka10g8n3p9v137isw2ims-rocksdb-v10.5.fb.drv
Author
Member

I guess those patches from upstream in postPatch should also be completely disabled, since we already applied them in our fork:


  postPatch =
    lib.optionalString (lib.versionOlder finalAttrs.version "8") ''
      # Fix gcc-13 build failures due to missing <cstdint> and
      # <system_error> includes, fixed upstream since 8.x
      sed -e '1i #include <cstdint>' -i db/compaction/compaction_iteration_stats.h
      sed -e '1i #include <cstdint>' -i table/block_based/data_block_hash_index.h
      sed -e '1i #include <cstdint>' -i util/string_util.h
      sed -e '1i #include <cstdint>' -i include/rocksdb/utilities/checkpoint.h
    ''
    + lib.optionalString (lib.versionOlder finalAttrs.version "7") ''
      # Fix gcc-13 build failures due to missing <cstdint> and
      # <system_error> includes, fixed upstream since 7.x
      sed -e '1i #include <system_error>' -i third-party/folly/folly/synchronization/detail/ProxyLockable-inl.h
    '';
Contributor

Basically you can just omit postPatch altogether. Upstream postPatch is fine. However, keep patches empty.

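In other words, the override could shrink to roughly this; a sketch assuming the fork's `src`/`version` are set elsewhere in the same file:

```nix
rocksdb = pkgs.rocksdb.overrideAttrs (prev: {
  # Our fork already carries the nixpkgs patches, so drop them...
  patches = [ ];
  # ...but leave postPatch alone: upstream's hook is version-gated
  # and should be a no-op for a current release.
});
```
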
Author
Member

I think what might be happening here is that upstream's

(lib.versionOlder finalAttrs.version "7")

is confused by us overwriting the version with something that isn't semver, hence it executes the optional stuff, which is nonsense for our version.

Author
Member

The thing is, folly doesn't exist in our rocksdb version:

https://forgejo.ellis.link/continuwuation/rocksdb/src/branch/10.5.fb/third-party

Author
Member

I'm confused; upstream rocksdb also doesn't have it in-tree. Do we need to fetch this? How does upstream nixpkgs handle this?

https://github.com/facebook/rocksdb/tree/main/third-party

Author
Member

argggg. I set version = "v10.5.fb" when it should've been version = "10.5.fb" ._.

(this way lib.versionOlder returns the correct result)

Anyways, this is solved then. Thanks again!

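For the record, a small check of why the leading `v` mattered (evaluated with nixpkgs' `lib` in scope, e.g. in `nix repl`); in Nix's version ordering an alphabetic component sorts before a numeric one:

```nix
{
  withPrefix = lib.versionOlder "v10.5.fb" "7"; # true  -> upstream's "< 7" postPatch sed runs
  withoutPrefix = lib.versionOlder "10.5.fb" "7"; # false -> the sed is skipped, as intended
}
```
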
Aviac marked this conversation as resolved
@ -0,0 +75,4 @@
'';
});
rocksdb =
Contributor

Instead of creating multiple rocksdb variant packages, why not create a single one where you can override liburing/jemalloc using bools? That's how it's done upstream (https://github.com/NixOS/nixpkgs/blob/89c0fd3272c90aaa237c6f2a1cb0cc5741d2b6ca/pkgs/by-name/ma/matrix-continuwuity/package.nix#L15). I think it might be a cleaner solution and reduce a few LOC.

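A rough sketch of that shape, reusing the `enableJemalloc` / `enableLiburing` flag names that show up later in this PR; the exact file layout is an assumption:

```nix
# nix/packages/rocksdb.nix (illustrative): one package, variants selected via .override
{
  pkgs,
  enableJemalloc ? false,
  enableLiburing ? true,
}:
# assumes the base rocksdb derivation exposes flags with the same names
(pkgs.rocksdb.override { inherit enableJemalloc enableLiburing; }).overrideAttrs (prev: {
  # fork-specific src, version, and cmakeFlags adjustments go here
})
```

Callers then pick the variant with `rocksdb.override { enableJemalloc = true; enableLiburing = true; }`, as shown further down in this thread.
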
Aviac marked this conversation as resolved
Author
Member

@savyajha Thanks for the review! That's exactly what was missing, very good stuff and I think this simplified a few things really nicely 💯

restructure rocksdb package and make it override-able
All checks were successful
Documentation / Build and Deploy Documentation (pull_request) Has been skipped
Checks / Prek / Pre-commit & Formatting (pull_request) Successful in 1m54s
Release Docker Image / Build linux-arm64 (release) (pull_request) Successful in 6m52s
Release Docker Image / Build linux-amd64 (release) (pull_request) Successful in 6m57s
Release Docker Image / Create Multi-arch Release Manifest (pull_request) Successful in 6s
Checks / Prek / Clippy and Cargo Tests (pull_request) Successful in 7m52s
Release Docker Image / Build linux-amd64 (max-perf) (pull_request) Successful in 12m42s
Release Docker Image / Build linux-arm64 (max-perf) (pull_request) Successful in 12m47s
Release Docker Image / Create Max-Perf Manifest (pull_request) Successful in 4s
d4f6dc9ad3
better source filter to prevent needless recompiles
All checks were successful
Documentation / Build and Deploy Documentation (pull_request) Has been skipped
Checks / Prek / Pre-commit & Formatting (pull_request) Successful in 1m39s
Release Docker Image / Build linux-arm64 (release) (pull_request) Successful in 6m30s
Release Docker Image / Build linux-amd64 (release) (pull_request) Successful in 6m39s
Release Docker Image / Create Multi-arch Release Manifest (pull_request) Successful in 6s
Checks / Prek / Clippy and Cargo Tests (pull_request) Successful in 7m55s
Release Docker Image / Build linux-arm64 (max-perf) (pull_request) Successful in 13m31s
Release Docker Image / Build linux-amd64 (max-perf) (pull_request) Successful in 13m36s
Release Docker Image / Create Max-Perf Manifest (pull_request) Successful in 11s
4dac1fe739
Contributor

I'd recommend just using unchanged upstream rust-jemalloc-sys-unprefixed and liburing to avoid unnecessary recompilations here. You don't need to redefine the packages, just use the nixpkgs versions. The changes being made aren't major.

Owner

I haven't reviewed the jemalloc changes, but I'm not aware of anything that would prevent using upstream. Liburing should already be upstream? We normally use distro packages for that iirc.

this added `jemalloc_prof` through transitivity and caused a warning
about an invalid jemalloc conf option
This caused issues with the package because an upstream conditional
check was applied: apparently v10.5 < 7 according to `lib.versionOlder`
... :o
cleanup after using upstream liburing
All checks were successful
Documentation / Build and Deploy Documentation (pull_request) Has been skipped
Checks / Prek / Pre-commit & Formatting (pull_request) Successful in 1m22s
Release Docker Image / Build linux-amd64 (release) (pull_request) Successful in 5m38s
Release Docker Image / Build linux-arm64 (release) (pull_request) Successful in 5m45s
Release Docker Image / Create Multi-arch Release Manifest (pull_request) Successful in 5s
Checks / Prek / Clippy and Cargo Tests (pull_request) Successful in 6m12s
Release Docker Image / Build linux-amd64 (max-perf) (pull_request) Successful in 11m35s
Release Docker Image / Build linux-arm64 (max-perf) (pull_request) Successful in 11m44s
Release Docker Image / Create Max-Perf Manifest (pull_request) Successful in 4s
ee47a5d7ab
@ -0,0 +63,4 @@
doCheck = true;
nativeBuildInputs = [
# bindgen needs the build platform's libclang. Apparently due to "splicing
Contributor

Should this not be self'.rocksdb?

Author
Member

It's a bit complex. This is a function that's called in other functions like buildPackage and buildDeps (below). These functions are called in nix/packages/continuwuity/default.nix with the right version of rocksdb depending on the feature set, like so:

              rocksdb = self'.packages.rocksdb;
              ...
              # blah
              ...
              rocksdb = self'.packages.rocksdb.override {
                enableJemalloc = true;
                enableLiburing = true;
              };
Aviac marked this conversation as resolved
Author
Member

@savyajha Any further thoughts?

Contributor

@Aviac I'd still prefer it if we used upstream rust-jemalloc-sys - it seems to work perfectly in nixpkgs as well as in the previous iteration. I would like to go through this and debug this thoroughly but I've got too much on my plate IRL atm. Apart from that I think this is a very well-thought out implementation and I have learned a lot going through your code. :)

Owner

Are we good to merge in this state then?

Author
Member

@savyajha wrote in #1096 (comment):

@Aviac I'd still prefer it if we used upstream rust-jemalloc-sys - it seems to work perfectly in nixpkgs as well as in the previous iteration. I would like to go through this and debug this thoroughly but I've got too much on my plate IRL atm. Apart from that I think this is a very well-thought out implementation and I have learned a lot going through your code. :)

Yeah, me too. Can we do that as a follow-up? Are there any open issues from your side aside from that?

Contributor

No other comments from me. Looks good to go!

savyajha approved these changes 2025-10-13 17:06:17 +00:00
Owner

Needs a rebase before we merge because of the lockfile conflict

Aviac force-pushed flake-parts-isation from ee47a5d7ab
All checks were successful
Documentation / Build and Deploy Documentation (pull_request) Has been skipped
Checks / Prek / Pre-commit & Formatting (pull_request) Successful in 1m22s
Release Docker Image / Build linux-amd64 (release) (pull_request) Successful in 5m38s
Release Docker Image / Build linux-arm64 (release) (pull_request) Successful in 5m45s
Release Docker Image / Create Multi-arch Release Manifest (pull_request) Successful in 5s
Checks / Prek / Clippy and Cargo Tests (pull_request) Successful in 6m12s
Release Docker Image / Build linux-amd64 (max-perf) (pull_request) Successful in 11m35s
Release Docker Image / Build linux-arm64 (max-perf) (pull_request) Successful in 11m44s
Release Docker Image / Create Max-Perf Manifest (pull_request) Successful in 4s
to f715f16773
All checks were successful
Documentation / Build and Deploy Documentation (pull_request) Has been skipped
Checks / Prek / Pre-commit & Formatting (pull_request) Successful in 2m9s
Release Docker Image / Build linux-amd64 (release) (pull_request) Successful in 9m48s
Release Docker Image / Build linux-arm64 (release) (pull_request) Successful in 9m43s
Release Docker Image / Create Multi-arch Release Manifest (pull_request) Successful in 7s
Checks / Prek / Clippy and Cargo Tests (pull_request) Successful in 12m40s
Release Docker Image / Build linux-amd64 (max-perf) (pull_request) Successful in 18m39s
Release Docker Image / Build linux-arm64 (max-perf) (pull_request) Successful in 18m49s
Release Docker Image / Create Max-Perf Manifest (pull_request) Successful in 13s
2025-10-16 06:47:35 +00:00
feat: add hydra jobs to build all packages
All checks were successful
Documentation / Build and Deploy Documentation (pull_request) Has been skipped
Checks / Prek / Pre-commit & Formatting (pull_request) Successful in 2m27s
Release Docker Image / Build linux-amd64 (release) (pull_request) Successful in 7m20s
Release Docker Image / Build linux-arm64 (release) (pull_request) Successful in 8m21s
Release Docker Image / Create Multi-arch Release Manifest (pull_request) Successful in 8s
Checks / Prek / Clippy and Cargo Tests (pull_request) Successful in 11m58s
Release Docker Image / Build linux-amd64 (max-perf) (pull_request) Successful in 18m9s
Release Docker Image / Build linux-arm64 (max-perf) (pull_request) Successful in 18m13s
Release Docker Image / Create Max-Perf Manifest (pull_request) Successful in 5s
b3df1fca11
Author
Member

Sorry, I also pushed some new changes to the branch after the rebase. They include treefmt, a formatter for the whole tree configured in Nix. I will put them in a separate branch though, so we can have them in a separate PR.

Aviac force-pushed flake-parts-isation from b3df1fca11
All checks were successful
Documentation / Build and Deploy Documentation (pull_request) Has been skipped
Checks / Prek / Pre-commit & Formatting (pull_request) Successful in 2m27s
Release Docker Image / Build linux-amd64 (release) (pull_request) Successful in 7m20s
Release Docker Image / Build linux-arm64 (release) (pull_request) Successful in 8m21s
Release Docker Image / Create Multi-arch Release Manifest (pull_request) Successful in 8s
Checks / Prek / Clippy and Cargo Tests (pull_request) Successful in 11m58s
Release Docker Image / Build linux-amd64 (max-perf) (pull_request) Successful in 18m9s
Release Docker Image / Build linux-arm64 (max-perf) (pull_request) Successful in 18m13s
Release Docker Image / Create Max-Perf Manifest (pull_request) Successful in 5s
to c24862aa6d
All checks were successful
Documentation / Build and Deploy Documentation (pull_request) Has been skipped
Checks / Prek / Pre-commit & Formatting (pull_request) Successful in 2m12s
Release Docker Image / Build linux-amd64 (release) (pull_request) Successful in 8m47s
Release Docker Image / Build linux-arm64 (release) (pull_request) Successful in 8m53s
Release Docker Image / Create Multi-arch Release Manifest (pull_request) Successful in 6s
Checks / Prek / Clippy and Cargo Tests (pull_request) Successful in 11m47s
Release Docker Image / Build linux-amd64 (max-perf) (pull_request) Successful in 18m40s
Release Docker Image / Build linux-arm64 (max-perf) (pull_request) Successful in 18m43s
Release Docker Image / Create Max-Perf Manifest (pull_request) Successful in 6s
2025-10-16 08:45:36 +00:00
Author
Member

@Jade now it's properly rebased. I made one small addition though, which is the last commit. It adds hydra jobs, which define jobs for the Nix build server software Hydra. I'm going to build them with my own build server so that we continuously verify that everything on the Nix side works.

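For context, `hydraJobs` is just a flake output containing an attribute set of derivations that Hydra walks and builds; a rough sketch of exposing every package (the actual wiring on this branch is an assumption):

```nix
{
  # Flake output consumed by Hydra: nested attrsets of derivations become jobs,
  # e.g. "packages.x86_64-linux.default".
  hydraJobs = {
    inherit (self) packages;
    # checks could be exposed the same way if desired
  };
}
```
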
Owner

@Aviac feel free to mention me or request a review when you're ready for merge

Contributor

I think the treefmt being part of this is fine, and maybe making a formatting check too? I think some of the commits should be squashed as well :3

Aviac force-pushed flake-parts-isation from c24862aa6d
All checks were successful
Documentation / Build and Deploy Documentation (pull_request) Has been skipped
Checks / Prek / Pre-commit & Formatting (pull_request) Successful in 2m12s
Release Docker Image / Build linux-amd64 (release) (pull_request) Successful in 8m47s
Release Docker Image / Build linux-arm64 (release) (pull_request) Successful in 8m53s
Release Docker Image / Create Multi-arch Release Manifest (pull_request) Successful in 6s
Checks / Prek / Clippy and Cargo Tests (pull_request) Successful in 11m47s
Release Docker Image / Build linux-amd64 (max-perf) (pull_request) Successful in 18m40s
Release Docker Image / Build linux-arm64 (max-perf) (pull_request) Successful in 18m43s
Release Docker Image / Create Max-Perf Manifest (pull_request) Successful in 6s
to 4631546e7a
All checks were successful
Documentation / Build and Deploy Documentation (pull_request) Has been skipped
Checks / Prek / Pre-commit & Formatting (pull_request) Successful in 2m3s
Release Docker Image / Build linux-amd64 (release) (pull_request) Successful in 11m29s
Release Docker Image / Build linux-arm64 (release) (pull_request) Successful in 11m29s
Checks / Prek / Clippy and Cargo Tests (pull_request) Successful in 12m11s
Release Docker Image / Create Multi-arch Release Manifest (pull_request) Successful in 47s
Release Docker Image / Build linux-amd64 (max-perf) (pull_request) Successful in 16m56s
Release Docker Image / Build linux-arm64 (max-perf) (pull_request) Successful in 15m52s
Release Docker Image / Create Max-Perf Manifest (pull_request) Successful in 31s
2025-10-18 14:02:45 +00:00
Aviac force-pushed flake-parts-isation from 4631546e7a
All checks were successful
Documentation / Build and Deploy Documentation (pull_request) Has been skipped
Checks / Prek / Pre-commit & Formatting (pull_request) Successful in 2m3s
Release Docker Image / Build linux-amd64 (release) (pull_request) Successful in 11m29s
Release Docker Image / Build linux-arm64 (release) (pull_request) Successful in 11m29s
Checks / Prek / Clippy and Cargo Tests (pull_request) Successful in 12m11s
Release Docker Image / Create Multi-arch Release Manifest (pull_request) Successful in 47s
Release Docker Image / Build linux-amd64 (max-perf) (pull_request) Successful in 16m56s
Release Docker Image / Build linux-arm64 (max-perf) (pull_request) Successful in 15m52s
Release Docker Image / Create Max-Perf Manifest (pull_request) Successful in 31s
to bc3db4c04d
Some checks failed
Documentation / Build and Deploy Documentation (pull_request) Has been skipped
Checks / Prek / Pre-commit & Formatting (pull_request) Successful in 2m13s
Release Docker Image / Build linux-arm64 (release) (pull_request) Failing after 1m35s
Release Docker Image / Build linux-amd64 (release) (pull_request) Successful in 8m48s
Release Docker Image / Create Multi-arch Release Manifest (pull_request) Has been skipped
Release Docker Image / Build linux-amd64 (max-perf) (pull_request) Has been skipped
Release Docker Image / Build linux-arm64 (max-perf) (pull_request) Has been skipped
Release Docker Image / Create Max-Perf Manifest (pull_request) Has been skipped
Checks / Prek / Clippy and Cargo Tests (pull_request) Successful in 13m18s
2025-10-18 14:06:58 +00:00
feat: add taplo.toml to check now that we have it
All checks were successful
Documentation / Build and Deploy Documentation (pull_request) Has been skipped
Checks / Prek / Pre-commit & Formatting (pull_request) Successful in 3m7s
Release Docker Image / Build linux-amd64 (release) (pull_request) Successful in 7m13s
Checks / Prek / Clippy and Cargo Tests (pull_request) Successful in 10m54s
Release Docker Image / Build linux-arm64 (release) (pull_request) Successful in 7m5s
Release Docker Image / Create Multi-arch Release Manifest (pull_request) Successful in 49s
Release Docker Image / Build linux-amd64 (max-perf) (pull_request) Successful in 23m37s
Release Docker Image / Build linux-arm64 (max-perf) (pull_request) Successful in 14m53s
Release Docker Image / Create Max-Perf Manifest (pull_request) Successful in 15s
a8e07b6a3f
chore: run nix fmt
Some checks failed
Documentation / Build and Deploy Documentation (pull_request) Has been skipped
Checks / Prek / Pre-commit & Formatting (pull_request) Successful in 2m38s
Update flake hashes / update-flake-hashes (pull_request) Failing after 44s
Release Docker Image / Build linux-arm64 (release) (pull_request) Successful in 9m1s
Release Docker Image / Build linux-amd64 (release) (pull_request) Successful in 9m38s
Checks / Prek / Clippy and Cargo Tests (pull_request) Successful in 24m30s
Release Docker Image / Create Multi-arch Release Manifest (pull_request) Successful in 14s
Release Docker Image / Build linux-amd64 (max-perf) (pull_request) Successful in 21m10s
Release Docker Image / Build linux-arm64 (max-perf) (pull_request) Successful in 23m10s
Release Docker Image / Create Max-Perf Manifest (pull_request) Successful in 10s
5957c71b2d
Author
Member
  • squashed a lot of commits
  • added treefmt
  • did some formatting of the toml files (no CI yet though, but it also wasn't checked before anyway, so 🤷)
  • rebased

@nex I'm basically done

@nyanbinary just pinging you FYI
