<sss1>so my problem is, luks support in grub is limited (as of my last test), and i want to use luks2 + argon2id, which is not supported in grub yet
<nckx>The 'BIOS boot partition' is a few hundred KiB of raw bytes (no file system) that stores the bulk of GRUB that won't fit in the traditional 'pre-MBR gap'. It can't be mounted, at /boot or anywhere else.
<nckx>sss1: Then you're out of luck with the mechanisms currently provided by Guix.
<attila_lendvai>how are tests run when cross-compiling a package? i'm reading that dependencies for tests should go into native-inputs, but the host won't be able to run the tests when cross-compiling...
<iskarian>attila_lendvai, regarding pinning versions for go-ethereum: it's a tough call, because it means we have more package versions to maintain. Maybe it would be best to stick go-ethereum and all the version-specific packages in a separate .scm file?
<sneek>iskarian, maximed says: About the git-fetch patches: I've written those let& and let*& macros and let them replace the 'let' and 'let*' macros and it seems to work (with some changes). I should be able to submit them sometime this week
<attila_lendvai>iskarian, yep, that makes sense, to add a go-ethereum.scm with all the pinned stuff in it. but is it The Right Way to do it in the long term? to add countless go packages using guix import go?
<attila_lendvai>iskarian, or hacking more on the other way where go itself downloads the dependencies (see my recent mail to guix-devel)
<iskarian>I don't know. This is one of those places where the Go methodology and the Guix methodology really conflict. Go wants to have a million versions of everything, forever available, and to allow any package to mix-and-match. Guix wants the minimal number of versions of something available.
<attila_lendvai>i'm willing to do either of them, but i lack the bird's eye view perspective to decide
<attila_lendvai>iskarian, that sounds like the second way, to allow go to fetch stuff, and guix to check the hashes
<iskarian>How do we check hashes if it's not already in Guix? ;) We're not just going to trust a hash from a proxy
<iskarian>Go provides a facility for a go.sum file which contains a hash of each dependency's repository; but because it ships alongside the code it only protects against corruption in transit, it does not solve the issue of "how do I know the source repository hasn't been tampered with?"
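(Editor's note: for readers unfamiliar with the format iskarian refers to, this is roughly what go.sum entries look like. The module path, pseudo-version, and hashes below are placeholders, not real entries; each module typically gets two lines, one hash covering the module zip and one, marked "/go.mod", covering only the module file.)

```text
golang.org/x/net v0.0.0-YYYYMMDDHHMMSS-abcdef123456 h1:<base64 hash>
golang.org/x/net v0.0.0-YYYYMMDDHHMMSS-abcdef123456/go.mod h1:<base64 hash>
```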
<attila_lendvai>iskarian, yep. allow go's package manager to fetch all the things, and then calculate/check a hash of it
<iskarian>This would be a new way of doing things for Guix; currently, all packages are separately verified, compiled, tested, and so on.
<iskarian>In addition you end up with duplication of effort when a dependency has to be modified for Guix, with the same change being copied to all packages using it.
<attila_lendvai>what's "the derivation hash"? the hash of the binary output? i may be misunderstanding/misrepresenting what's described on that issue.
<attila_lendvai>iskarian, i don't understand your last remark about duplicate effort. this way i just calculate the hash, add it to the go-ethereum package, and leave everything to go. guix packaging of go stuff becomes completely orthogonal to this.
<attila_lendvai>note that NixOS does the same, i think, although it puts all the dependencies into a tgz and puts it into the store/cache, so that vendoring is memoized
*attila_lendvai hasn't learned the proper nomenclature yet
<iskarian>Let's say that golang.org/x/net requires a patch to work correctly on Guix. Then if we start packaging Go packages with all their dependencies as part of the source, then every package which uses golang.org/x/net will have to copy-and-paste that patch. Currently, only go-golang-org-x-net package would have to be patched.
<iskarian>Additionally, this means that there will be a copy of the source for golang.org/x/net in every package which uses it, rather than one copy.
<attila_lendvai>iskarian, i think such patching will be a rare thing with go, but this is only an impression
<attila_lendvai>iskarian, of the *source*? guix also stores the source of everything in the store?
<iskarian>(And not only a copy of the source, but the dependency will have to be fetched for every package.)
<vagrantc>well, you could create "packages" that only ship sources and patch them once, and use them for various inputs
<iskarian>That's essentially what Go does currently, except most of these also build and test themselves
<iskarian>most Go packages in Guix actually only put their source in output; no compiled artifacts
<attila_lendvai>iskarian, out of curiosity, do you know how go vendoring is done on NixOS? because i don't really know the details, and i'm wondering whether that could/should be "ported" to guix...
<iskarian>I doubt that Nix's methodology would square with Guix's
<iskarian>(I'm not familiar with the Go effort in Nix)
<iskarian>Yes, and to be clear, I'm not trying to shut you down, but rather argue the harder points first to see if there is a way forward
<iskarian>It would be lovely to not have to handle the mess of Go(/Rust/...) dependencies in Guix, but the current trajectory seems to be to package dependencies individually
<iskarian>(I am a relative newcomer to Guix myself; only been around a few months)
<attila_lendvai>how do i see which go packages are not built, merely sources? all i see is (build-system go-build-system) in golang.scm, which i assume means compiling them. which brings up the question, what if a project wants to be compiled with a different version of go itself?
<iskarian>Some packages (like go-golang-org-x-net) use #:tests? #f and delete the build phase, and those are definitely merely sources. However, I would say the vast majority of "go-..." packages only have source in their output
<iskarian>To compile a package with a different version of go, say "go-1.16", use (arguments `(#:go ,go-1.16))
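(Editor's note: a sketch, not a real package from the Guix tree, combining the two points above: a hypothetical source-only Go library that skips building and testing, with the toolchain pinned via #:go. The name, URL, and hash are placeholders.)

```scheme
;; Hypothetical source-only Go library; illustrative only.
(define-public go-example-com-foo
  (package
    (name "go-example-com-foo")
    (version "1.0.0")
    (source (origin
              (method git-fetch)
              (uri (git-reference
                    (url "https://example.com/foo")
                    (commit (string-append "v" version))))
              (file-name (git-file-name name version))
              (sha256
               (base32 "0000000000000000000000000000000000000000000000000000"))))
    (build-system go-build-system)
    (arguments
     `(#:go ,go-1.16                       ; pin the Go toolchain
       #:import-path "example.com/foo"
       #:tests? #f                         ; source-only: skip tests
       #:phases (modify-phases %standard-phases
                  (delete 'build))))       ; ...and skip compilation
    (home-page "https://example.com/foo")
    (synopsis "Hypothetical source-only Go library")
    (description "Illustrative only.")
    (license license:expat)))
```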
<attila_lendvai>iskarian, indeed. i just checked go-github-com-apparentlymart-go-openvpn-mgmt and its output only has sources. i don't understand why, though.
<iskarian>Currently only go-1.14 and go-1.16 are packaged in Guix
<iskarian>I have a patch for go-1.17 which I haven't yet sent for various reasons, but soon(tm)
<vagrantc>doesn't go statically compile everything? so for any go library, you would just need the sources available and not a compiled library ... unless i'm missing something
<attila_lendvai>but how come i've seen some tests failing? does the go-build-system by default try to build it and run the tests, but only package the source when that succeeds?
<iskarian>Okay, so. Currently, every dependency is a separate Guix package. Each dependency is treated just like a normal Guix package, and since Guix builds and tests all the inputs to a package before building that package, all those dependencies are built and tested.
<iskarian>However, because the Go build system has not seen a lot of love, no artifacts other than executables are saved; only source is copied into the output.
<iskarian>(which works, because Go needs the source of all dependencies, but build artifacts just serve as a cache)
<attila_lendvai>iskarian, i think saving binaries may not be a good idea. e.g. the same dependency is used by two packages that need to be compiled with different versions of go. (not sure whether linking those is supported by go)
<iskarian>attila_lendvai, you're roughly correct, it would be wasted space; but it wouldn't hurt, since Go just treats them as a cache and recompiles them if it would produce different output
<attila_lendvai>tools like go-ethereum should be reproducible builds, and compiling a random dependency somewhere with a different version of go may lead to a different binary result
<iskarian>It doesn't save much time anyway, so probably not worth it
<attila_lendvai>(keep in mind though, that i'm very new to this. add grains of salt as necessary... :)
<iskarian>when you say "reproducible builds", do you mean the Guix package should be reproducible by different people using the same definition, or that the Guix package should produce the same output as someone compiling from source on a foreign distro?
<iskarian>I vaguely recall an effort to overhaul the Rust build system/ecosystem. Does anyone know who might be behind that?
<iskarian>sneek, later tell maximed: The implementation goes over my head, but the overall approach seems sound. I still think the (let (...) (package ...)) idiom feels clunky and should be replaced with something else, though. Also, I think https://issues.guix.gnu.org/50274 might have the git-fetch updater effort in mind? :)
<iskarian>sneek, later tell attila_lendvai: One issue you may encounter with reproducibility with go-ethereum is that the Go build system does not use modules (yet!) and I believe Go embeds the module version of dependencies in built artifacts, so the result may differ if go-ethereum is supposed to be built in module-aware mode.
<sneek>clone1, bricewge says: By any chance, do you still have the code from which you submitted #46907? It fails to apply with git: “error: corrupt patch at line 15”. Would you mind re-sending the patch?
<apteryx>clone1: nothing clean cut is readily available, but there was an attempt based on package-with-python that got close (I tried it, had some issues)
<apteryx>with motivation it could probably be made to work
<efraim>iskarian: yeah, that's me. My plan was basically cargo-inputs -> inputs and cargo-development-inputs -> native-inputs, try to find any circular dependencies among the ~2500 packages, and see about disabling tests and removing some really old and potentially unneeded packages that got pulled in through cargo-development-inputs
<mothacehe>apteryx: many thanks for fixing the python path & repack fixes, I'm currently testing them :)
<attila_lendvai>iskarian, thanks for the importer fix! i ran it on go-ethereum: ./pre-inst-env guix import go -r --pin-versions firstname.lastname@example.org >/tmp/x.scm but apparently it's possible to refer to subdirectories in go.mod...
<sneek>Welcome back attila_lendvai, you have 1 message!
<sneek>attila_lendvai, iskarian says: One issue you may encounter with reproducibility with go-ethereum is that the Go build system does not use modules (yet!) and I believe Go embeds the module version of dependencies in built artifacts, so the result may differ if go-ethereum is supposed to be built in module-aware mode.
<attila_lendvai>iskarian, these are entries in the go.mod of go-ethereum: github.com/aws/aws-sdk-go-v2 v1.2.0 ; github.com/aws/aws-sdk-go-v2/config v1.1.1 ; github.com/aws/aws-sdk-go-v2/credentials v1.1.1
<attila_lendvai>iskarian, these are subdirectories of the repo, and i guess the dependency means a checkout of only that directory, but at the specified version
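(Editor's note: a go.mod excerpt in the style attila_lendvai describes; the versions are the ones quoted above, but this is an illustrative fragment, not go-ethereum's actual go.mod. Two of the three modules live in subdirectories of the same repository, each pinned at its own version.)

```text
require (
    github.com/aws/aws-sdk-go-v2 v1.2.0
    github.com/aws/aws-sdk-go-v2/config v1.1.1
    github.com/aws/aws-sdk-go-v2/credentials v1.1.1
)
```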
<cage>roptat: seems that yours was a good suggestion. i inspected the directory under guix-build and noticed a file named "environment-variables"; if i try the configure script after "source environment-variables", it fails with the same error as guix build
<apteryx>hmm, the "#:tests? must not be explicitly set to #t" lint check doesn't play well with our Emacs build check phase (which is disabled by default).
<cage>roptat: without "source environment-variables" the configure passes
<roptat>it could be a missing dependency, so guix cannot set the right env vars in the build environment
<roptat>which variable is missing? then you can find which package provides a definition for it and add it to the package inputs
<roptat>note that the build environment is completely isolated from the host system, so it can only see what you declare in the recipe, it doesn't care what's installed on the system or in your user profile
<iskarian>sneek, later tell attila_lendvai: Currently, you would make three packages for those aws repos, and for the latter two use '#:unpack-path "github.com/aws/aws-sdk-go-v2"'. AFAICT "guix import go github.com/influxdata/influxdb-client-go/v2" works correctly; what's the issue?
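(Editor's note: a sketch of the arguments iskarian describes, showing how #:unpack-path and #:import-path would combine for one of the aws subdirectory modules. The whole repository is unpacked at #:unpack-path while #:import-path selects the subdirectory module to build; this is illustrative, not copied from a real package.)

```scheme
(arguments
 `(#:unpack-path "github.com/aws/aws-sdk-go-v2"
   #:import-path "github.com/aws/aws-sdk-go-v2/config"))
```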
<apteryx>good news! the rust bootstrap will be reduced to ~4 hours on core-updates after the patches for bootstrapping from 1.39 lands. That's 25% of the time it would take on the master branch (~16 hours).
<iskarian>thanks, I'll have to do some mailing list archaeology (anthropology?) :)
<lfam>For a while I thought we could have a Go-specific method of instantiating Go dependencies based on revisions. So each Go library would have a canonical package, but then you could concisely instantiate a different revision when using the library in a package that uses it, without a lot of boilerplate
<iskarian>I've been playing around with different overhauls of the build system which uses modules, but like you said, there are some fundamental disconnects between Go and Guix-land that make it difficult without bolting-on extra machinery
<lfam>That message is somewhat obsolete too. My understanding was still very primitive at that point. I will see if I can find a more recent summary / proposal
<lfam>Do you think the disconnects are largely in terms of dependency management? And lack of "versions" in Go?
<iskarian>Rather, I think it's due to too many versions in Go
<lfam>The reason I worked on Go for a while is that I wanted us to have a Syncthing package. I did a ton of work unbundling the dependencies, massaging their custom build.go, and landing the go-build-system, but now we are just using the bundled dependencies. They are all free software of course, so it's basically fine although not idiomatic for Guix
<lfam>I know that bundling / vendoring is considered a Bad Idea by distros, and in general I agree with that position. But for Go, I now think that there is little or no value in unbundling, considering the effort it requires
<lfam>What would it do iskarian? Something different from `guix import go foo`?
<iskarian>I mean, in a package, rather than git-fetch or url-fetch, you would have go-fetch
<iskarian>That could intelligently download dependencies, and still leave us with a static source
<lfam>Also, to zoom out, the entire set of values that inform what distros think about bundling should be understood in the context of the history of distros and distro tooling. This context has obviously changed since these values were developed and transmitted throughout the community decades ago
<lfam>iskarian: So, it would read go.mod, fetch everything, and put it into a single tarball or directory tree?
<iskarian>Essentially, yeah. Maybe it would fetch them all into separate store items and symlink them, so we get *some* de-duplication of dependency source
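(Editor's note: purely hypothetical sketch of the 'go-fetch' idea discussed above. Neither 'go-fetch' nor 'go-module-reference' exists in Guix; this only illustrates the shape of an origin that would fetch all go.mod dependencies as one fixed-output derivation verified by a single hash. All names and the hash are placeholders.)

```scheme
(source (origin
          (method go-fetch)                ; hypothetical method
          (uri (go-module-reference        ; hypothetical record
                (module "github.com/ethereum/go-ethereum")
                (version "X.Y.Z")))
          (sha256
           (base32 "0000000000000000000000000000000000000000000000000000"))))
```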
<lfam>It would be nice to fetch them, in order to "trust but verify" upstream's bundling, especially if we are working from tarballs, like in Syncthing
<lfam>At this point I'm really only familiar with Syncthing, which is somewhat atypical. Like I said, they use a custom build script
<lfam>It keeps the dependency graph local to the package
<lfam>It's been a while since I thought about this stuff
<lfam>I wonder about transitive dependencies and how they would be resolved. I don't recall / understand how this is handled idiomatically in Go
<iskarian>I'm not sold on it yet. Since nearly all Go dependencies will just have source in their output, is there value in having separate packages with source and output for them, even if they're only local to the package?
<lfam>I think there is value in terms of the Guix user interface. Like, `guix show` et al
<iskarian>With this approach of manually specifying dependencies, I think we'd want to specify all transitive dependencies manually.
<moshy>zacchae[m]: Then it might not have been detected by the installer in my case. I'm assuming you have an ath9k on your Librem Mini 2; in theory it could be supported under Hurd, although I can't guarantee it
<iskarian>apteryx, can you expand on "I like that packages are defined as packages"?
<iskarian>lfam, what do you think about testing inputs specified with your proposed go-package?
<iskarian>You can do "go build ./..." or "go build github.com/some/package/..."
<lfam>I discussed this in one of those emails that I linked to, I think
<lfam>I just don't think it's something that is done
<lfam>At least that was my impression a couple years ago
<iskarian>Sure, Debian does it. Given an import path "example.com/package", Debian's dh-golang runs "go list example.com/package/...", then removes any packages you specify to exclude, then runs "go install" with that list of packages
<apteryx>iskarian: I meant that I don't see a reason, from my limited experience with Go packaging, why it'd be superior to have them represented as some other data type than a 'package' record from (guix packages), especially given all the complications it'll cause (as lfam hinted at, the tooling we've come to rely on).
<lfam>That is why, initially, Guix's Go packaging created a package per import-path: to build them. But later I realized that this was seriously unidiomatic and was causing difficulties for packagers, so we decided instead to package entire repos. I think it's important to try to work idiomatically or else you can't get any help upstream
<lfam>And when packaging a Go Git repo, you can't just build every command / library with a single command
<lfam>I see that Debian has addressed this somehow
<lfam>I guess I don't see the value. Either the library builds successfully while building the dependent application or it doesn't
<lfam>I don't think that upstream application developers do this however, so we'd be doing something extra and should weigh the cost and benefit accordingly
<iskarian>However, if we really wanted to do this testing, we could instead test all dependencies in the end-user package
<lfam>Zooming out, one of Guix's values is that we try to provide what upstream intends to distribute, as much as possible. We don't do significant development of packages (like Debian), and we don't change defaults unless we have to
<lfam>This doesn't fully translate to the process of building packages, especially libraries, but it still should be given some weight
<lfam>So if the upstream development and deployment workflow does not run tests, we shouldn't feel obligated to run them
<lfam>Good muradm! Back from a crazy work week. I saw your message about running the tests
<lfam>iskarian: But, that's not a reason to not run them. Just something to keep in mind
<lfam>A lot of upstreams, especially in new languages like Go, do not see much value in the distro model. So we should try to work in a way that does not piss them off too much, whatever that means in practice :)
<dstolfa>Applications should note that the standard PATH to the shell cannot be assumed to be either /bin/sh or /usr/bin/sh, and should be determined by interrogation of the PATH returned by getconf PATH
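(Editor's note: the POSIX point dstolfa quotes can be demonstrated directly — instead of hard-coding /bin/sh, ask the system for its standard utility search path via `getconf PATH` and locate sh there. A minimal sketch, assuming a POSIX system with `getconf` available.)

```shell
std_path=$(getconf PATH)              # the standard utility PATH
sh_path=$(PATH="$std_path" command -v sh)  # locate sh within it
echo "$sh_path"
```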
<attila_lendvai>iskarian, thanks for looking into it! i only have your redirect patch locally. i'll read up those two issues you linked to. meanwhile, these are the failing ones for me: guix import go --pin-versions email@example.com and guix import go --pin-versions firstname.lastname@example.org
<sneek>attila_lendvai, iskarian says: Currently, you would make three packages for those aws repos, and for the latter two use '#:unpack-path "github.com/aws/aws-sdk-go-v2"'. AFAICT "guix import go github.com/influxdata/influxdb-client-go/v2" works correctly; what's the issue?
<attila_lendvai>iskarian, the former is eventually initiated by the latter, which takes a couple of minutes
<attila_lendvai>iskarian, there's no issue with the /v2 stuff. that was a haphazard remark from me, ignore it.
<iskarian>attila_lendvai, `guix import go --pin-versions email@example.com' works for me with that redirect patch...
<civodul>apteryx: hi! just saw a message of yours earlier today: i don't think "guix pull --list-generations" should display "indexing objects", that looks fishy
<the_tubular>Anything that is container / vm related really impress me
<zamfofex>Hello, Guix! Is there a way to build GCC for x86 from x86‐64? I’m trying to compile a program that only works on x86 (32‐bit), and it requires libgcc. I tried ‘guix build --target=i686-linux-gnu gcc-toolchain’, but it failed at the “configure” phase with an error telling me that “the C compiler can’t create executables” while cross‐building coreutils. If I’m doing things wrong, any suggestions are appreciated
<roptat>like return (result == (CPNATIVE_OK && entryType) == CPFILE_FILE)? 1: 0;, but not sure
<roptat>also fun fact, a boolean is an integer, so that's not a type error
<dstolfa>roptat: == > && > ?: in C, so you'll end up with ((result == CPNATIVE_OK) && (entryType == CPFILE_FILE)) ? 1 : 0, and i sadly know this because a lot of code does really annoying stuff by omitting parentheses it should never omit by trying to be "clever"
<dstolfa>apparently being clever implies that you have to make the code impossible to read quickly to some programmers...
<roptat>mh... I thought I saw that issue with == and &&, but maybe that was a different language
<zamfofex>I think it makes sense that logical boolean operators would bind more loosely than comparison operators.
<roptat>it could just be that gcc is better at optimizing on core-updates than on master, it's not the same version, is it?
<dstolfa>zamfofex: that would parse, though the behavior is hilarious
<dstolfa>zamfofex: IIRC, what would happen here is you'd get c or d depending on a being true
<zamfofex>More “on topic”, it seems like the package ‘gcc-toolchain’ doesn’t actually include ‘gcc:lib’. Is there any way to refer to ‘gcc:lib’ from the CLI? Or otherwise be able to figure out a way to download the 32‐bit version to the store?
<zamfofex>dstolfa: Note that if “b” were “b()” or “b++”, it would be evaluated for its side‐effects.
<dstolfa>zamfofex: i've seen code like w = z = ++y, q = ++x; before quite often. it does what you'd expect but my god why not just use a semicolon
<civodul>right after the ENOENT from cpio_checkType
<zamfofex>dstolfa: I think the problem (in my case) is that the semicolon would end the “if” statement. To use it, I’d need to wrap the statements in curly braces, which is advantageous if there are more than two or three short statements, but I think it adds unnecessary clutter if the statements are short enough. Also: If you want to continue, I think it’d make sense to do so in a DM, since I think this is a bit too off‐topic here.
<dstolfa>fair, we can just end it here since it was just a fun little convo, coming up with C monstrosities is indeed off-topic :D
<zamfofex>It was fun, yeah! I always enjoy talking about syntax and whatnot.
<civodul>important fixes go to master, possibly as grafts
<zamfofex>In case anyone comes across this conversation in the future (since this is a question I have had multiple times before), in order to find the store path of ‘gcc:lib’, it suffices to run “gcc --print-file-name=libgcc.a”, and it should give you a path within the store entry for “gcc:lib”. In my case, I had to run that for the 32‐bit GCC from the store.