IRC channel logs



***Noisytoot is now known as [
<zimoun>civodul: for the stable branch, yes I am currently considering it. Maybe via GuixHPC. I do not know, I have to do some stats to estimate the effort. For sure, basically if a user does not have a Guix-savvy person around, it is really hard to work with it, professionally speaking.
<civodul>do you have in mind missing substitutes, broken packages, or other things as well?
<zimoun>for now, my rule of thumb to detect broken packages is to check missing substitutes. :-)
<zimoun>what I have in mind is: once a week, list the additions and updates, check with CI, cherry-pick.
<zimoun>it is some work to have a script and manage potential Git conflicts
<zimoun>but I think it would be worthwhile for attracting more users from academia.
<zimoun>in professional context, people need
<zimoun>…stability, somehow.
<zimoun>civodul, maybe Vagrant could tell more, but for instance, and from my understanding, Guix is currently “unstable” (basically, as with Debian, it mostly works) and I would like Guix to have a “testing”; if something works well enough on “unstable”, then it goes to “testing”, which means it just works (modulo some bugs).
<zimoun>I exposed the thing I have in mind here almost one year ago.
<zimoun>Consider a “complex” package used in scientific contexts, say openfoam or freecad.
<zimoun>Look at the rate of changes. It implies two things: one, it is highly probable that an unrelated change breaks it; and two, it consumes many CI resources.
<zimoun>Last, something moving less fast allows third-party channels to be based on it and then to follow it
<civodul>i think a "stable" branch that is stable but not frozen represents a lot of work
<civodul>i'd rather fix broken packages (or prevent breakage in the first place)
<civodul>but maybe it's worth experimenting in that direction too
<zimoun>the first time I heard of Q&A was the first time we met, back in December 2018. :-) Many things have been improved since then \o/ But not the ratio of broken packages.
<zimoun>My estimation is that this ratio is 5% of the total collection
<zimoun>Therefore, it is possible to fix 5% of 10 thousand packages. But it is becoming harder and harder to fix 5% of 20+ thousand packages.
<zimoun>My point is: the current workflow cannot scale up, IMHO.
<civodul>right, but additional tooling can help, we're still way behind what others do
<zimoun>yeah for sure, but “stability” does not seem the top priority of the project. No blame, moving fast and being up to date brings many other cool features. :-)
<civodul>"the project"?
<zimoun>and more tooling does not change that we directly push to master and we pull from master :-)
<zimoun>I think this proposal would help. And it is not the first time someone has proposed a similar workflow; I remember Hartmut, among others, doing so.
<civodul>yeah, i'm all for a true time-based core-updates/staging/master process
<civodul>what we need is someone to champion each cycle and make sure we stay within bounds
<civodul>so far, there hasn't really been that "someone"
<civodul>often because the people involved are too involved; hard to be time-keeper and hacker at the same time
<zimoun>I think it is more complicated than that.
<mbakke>I've been considering a "stable-updates" branch for core-level (but "safe") updates such as patch releases of Python, curl, nghttp2, pcre, etc ... it would fall in between "core-updates" and "staging", and replace the "ungrafting" cycle.
<mbakke>there would still be regressions of course, but at a much more manageable level
<mbakke>this "core-updates" round we fought regressions with GCC, Python, "sanity checks", Meson, etc, all at once
<mbakke>or perhaps all those should rather be topic branches :)
<mbakke>it takes a lot of experience to determine which updates are "safe" or not
<zimoun>heh, I was typing: how do you determine what is “safe”? :-)
<zimoun>rekado: have you tried to build R using musl instead of glibc?
<zimoun>for instance slides 2 and 3; yesterday I was trying to show something similar.
<zimoun>It would be a good use case for showing that capturing all of the environment matters when doing science.
<zimoun>on a side note, many people use Alpine as Docker base image in their Dockerfile. And Alpine uses musl as libc.
<rekado>zimoun: no, never tried using musl for anything
<PurpleSym>Any ffmpeg wizards here? Trying to record my talk for Guix Days, libx264rgb is darkening the image quite a bit. Not sure why.
<PurpleSym>The command is pretty much copy&paste from ffmpeg’s wiki: `ffmpeg -y -window_id 0xa800006 -draw_mouse 0 -framerate 25 -f x11grab -i :0.0 -c:v libx264rgb -crf 0 -preset ultrafast -color_range 2`
<civodul>zimoun: that Café Guix was rather nice, wasn't it?
<civodul>though i'd love to hear people a bit more
<civodul>i'm tired of those lists of names on my screen
<civodul>i want to see real people :-)
<zimoun>hehe! yeah physical events are missing. ;-)
<zimoun>yes, the Café Guix was nice.
<zimoun>Does Nive have inferior and time-machine?
<rekado>PurpleSym: I can’t help, I’m afraid.
<rekado>I used ffmpeg successfully for recoding video in the past, but it’s always been really difficult to find the right settings
<PurpleSym>Worth a shot, thanks anyway rekado .
<rekado>in the maintenance repo in talks/icg-2018/Makefile there’s an ffmpeg command for building a video from slides
<mbakke>PurpleSym: lfam on #guix is our resident ffmpeg expert :)
<civodul>zimoun: what's Nive?
<rekado>couldn’t find anything in my eternal bash history
<PurpleSym>rekado: I used that last time, but wanted to get fancy this year and draw on the slides using xournalpp :)
<zimoun>civodul: Nive = Nix + typo ;-)
<civodul>oh :-)
<civodul>in the Nix language you would refer to a Nixpkgs tarball for a specific revision
<civodul>that's how you would do time-machine and/or inferior, AIUI
<rekado>so, I built pytorch 1.9.0 and liblantern for r-torch. But when I symlink the build to the location expected by r-torch, it doesn’t work.
<rekado>it says: /gnu/store/kv80jv3dvg3a9g6aj4dzqicmbpak1b2w-r-torch-0.6.0/site-library/torch/deps/ undefined symbol: _ZTIN3c104TypeE
<rekado>I find that a little confusing. Why would it be missing a symbol?
<rekado>that should be a build-time problem, no?
<civodul>could be because it saw its declaration, but the thing is actually missing from the DSOs it depends on
<civodul>what does that demangle to? :-)
<rekado>no idea!
<rekado>ldd says that the library only depends on the usual stuff and the loader
<rekado>must be an error in the build system. This should link with libtorch, I think, but it doesn’t
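An aside for the curious: the mangled name can be decoded with c++filt from GNU binutils (assuming it is installed); `c10` is PyTorch's core C++ namespace, so the missing symbol is the typeinfo for `c10::Type`, which is consistent with the library not being linked against libtorch:

```shell
# Demangle the symbol reported by the dynamic linker.
echo '_ZTIN3c104TypeE' | c++filt
# prints: typeinfo for c10::Type
```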
<zimoun>civodul: but transformations (as Nix does them) are not the same as time-machine, because it is impossible to rewrite all the source tarballs by hand.
<civodul>zimoun: i think Nix lets you do something equivalent to time-machine, but with a clunky interface
<zimoun>civodul: hum? I would like to see because for me time-machine is unique. :-)
<civodul>in Nix language you can import code from a URL
<civodul>so you can do (roughly): "import"
<civodul>and then you can evaluate code therein
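A rough sketch of the pattern civodul describes, pinning Nixpkgs by importing a tarball for one revision (the commit hash here is a placeholder, not a real pin):

```nix
let
  # Fetch a specific Nixpkgs revision as a tarball and import it.
  pkgs = import (fetchTarball
    "https://github.com/NixOS/nixpkgs/archive/<commit>.tar.gz") { };
in
  pkgs.hello
```

Evaluating this with a current Nix gives you packages from the old Nixpkgs tree, which is the "clunky time-machine" being discussed.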
<civodul>but they make a different choice regarding coupling: Nixpkgs and Nix are relatively decoupled
<civodul>so, to some extent, a new Nix can evaluate an old Nixpkgs
<civodul>which is nice
<civodul>but to some extent only
<civodul>because the Nix language evolves
<civodul>it's a fine point that few people seem to care about, but one with a real impact
<zimoun>thanks for explaining. But that’s not time-machine. Because time-machine provides (or at least tries hard!) the same bit-to-bit (bug-to-bug) behaviour, i.e., all the build infrastructure is the same, and inspectable.
<zimoun>If Nix changes something, then Nixpkgs is different, and it is hard to know where the difference comes from. Because Nix changed? Because Nixpkgs did? Because of a butterfly effect? etc.
<zimoun>And as you said, time-machine fixes backward-compatibility issues. We could change some API (the new inputs style, or a new DSL using Wisp) without breaking the past.
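For comparison, the Guix side of this: time-machine takes a channels file (or a commit) and re-runs the requested command with that exact revision of Guix itself, build infrastructure included. A sketch, with a placeholder commit hash:

```scheme
;; channels.scm -- pin Guix to one revision (the commit is a placeholder)
(list (channel
        (name 'guix)
        (url "https://git.savannah.gnu.org/git/guix.git")
        (commit "<commit>")))
```

Then `guix time-machine -C channels.scm -- build hello` builds hello with the pinned Guix, not the one currently installed.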
<civodul>yes, exactly
<civodul>we don't pretend packages and core are loosely coupled
<zimoun>now that we are saying it, I had never thought of it this way: both work as a compiler, compiling a (domain-specific) language to binaries. We include the source with the compiler itself; otherwise the door is open to many issues: standards, compatibilities, etc. Since it is domain-specific, it makes sense (at least to me!) to include it all in the compiler.
<zimoun>Ouch, friday…
<civodul>it's discussed toward the end of
<civodul>in case your head doesn't hurt yet
<zimoun>bah you said all! :-)
<zimoun>«may support “time travels” over longer period of times. Time will tell!»
<zimoun>we are starting to have answers about that. And the recent OpenBLAS bug shows us that the issue is not from where it could have been expected. :-)
<civodul>yeah, it's a different class of problem