IRC channel logs
2021-11-19.log
<efraim>I can try to take a look at Julia in the next few days
<efraim>julia built on core-updates-frozen for me
<zimoun>civodul: on machine B, guix build julia, first try fails at test as reported in #guix with sneek. :-) Second try passes. Hum!?
<zimoun>civodul, nice slides! Especially the very first one :-)
<civodul>so tests fail non-deterministically, right?
<civodul>(i recommend looking at the CONDA and Spack slides above)
<zimoun>It seems so… Well, it looks like a hard time… because the Julia test suite is so time consuming.
<rekado>looking at the conda slides: the comparison of package numbers in Guix says “cf. Guix: ~19k packages[1], many of which automatically imported”
<civodul>i guess they got that idea from the tweet they refer to
<rekado>the spack slides are informative
<rekado>good overview on what they do and how
<civodul>i was surprised that the CONDA talk had "reproducible paper" first and foremost
<rekado>“automatically imported” is what I’d call the R packages in Nixpkgs.
<rekado>or what guix.install in R does for Guix
<civodul>well, our R imports are also largely automated, but curated and tweaked, i'd say
<rekado>but I’d still not claim the thousands of CRAN and Bioconductor packages :)
<rekado>it’s a distinction without a difference: when I use a yasnippet in Emacs — is this “automated”?
<civodul>the more curation happens in the upstream repo, the less work for us
<civodul>in that respect, CRAN seems to be doing a better job than PyPI
<civodul>for instance it contains only free software, right?
<zimoun>ahah! The Conda slides are *exactly* why we have the Replication Crisis. It adds mess instead of explaining where the issue with the computational environment comes from and trying to fix it. Sigh!
<zimoun>Something that I always find weird when we speak about reproducing a computational environment.
If the package manager depends on a resolver for dependencies, then, because resolvers rarely have a unique solution (at least, there is no guarantee), reproducibility is broken by design.
<zimoun>So why are they explaining at length that it fits a reproducible framework? Arf, buzzword surfing?
*civodul senses bitterness :-)
<civodul>i think the core of their demonstration is: binaries are kept forever, therefore things are reproducible
<civodul>which at first sight kinda makes sense
<civodul>our explanation has the drawback of being more verbose than this
<rekado>the decision to use a resolver is something I cannot follow. I’m sure they have good reasons for this, of course. But it just seems like a really difficult beast to tame.
<rekado>my colleague installed Galaxy tools, which are really just large Conda environments. And the solver ran for 4h, no exaggeration.
<rekado>this can probably be optimized. Spack uses solvers, too, and from what I understand they try hard to optimize things.
<rekado>but … is this really a good idea for Conda in general?
<zimoun>civodul, about "kept forever", many facts prove it is a wrong assumption. It is already wrong for source code (for instance, SWH. :-)). Why would it be a valid assumption for binaries?
<zimoun>Yes, a bit of bitterness. :-) Well, I am fine to say: hey, this tool just works. But I am not fine to say: hey, and in addition, this tool is a solution for that, when the design of the tool cannot make the second statement true.
<zimoun>Especially by scientists; because either they miss reviews of what others do and why they do so, or they are unethical. Aside from that, the world is full of concrete examples that the statement is just false. :-)
<rekado>zimoun: binaries *can* be kept “forever”. Unlike source code, these binaries are under the control of the project.
<rekado>these binaries have been built once and have never been touched again
<zimoun>hehe!
Well, I do not see why what happens to source code *cannot* happen to binaries. :-)
<zimoun>Darwinism of the internet: projects are created and then eventually vanish.
<rekado>I think it’s horrible to have a simple stash of binaries but not the associated source code.
<rekado>so even if it didn’t happen to binaries, it wouldn’t be good enough.
<rekado>I’ve seen archives of statically linked binaries that end up on tape.
<civodul>zimoun: i'm sure they do keep binaries "forever"; it's easy, you just need to pay Amazon and keep filling disks
<civodul>solvers are used to choose appropriate versions based on what's already installed, right?
<civodul>(i've spent way too much time on functional package managers :-))
<zimoun>civodul, yes. Basically SAT solvers.
<rekado>so by adding a package to your environment you can accidentally change all other packages.
<zimoun>you have a constraint set and the solver tries to find a combination satisfying all the constraints
<zimoun>about “forever”. Nah, it happens now because we are able to waste resources. I am not convinced that 1. it is a wise strategy, scientifically speaking; and 2. a soon-to-come resource-scarce world will keep up this paradigm of wasting.
<zimoun>Bah, ok, I'll stop rambling on. :-) And I am going back to real work, fixing this broken world. ;-)
<civodul>rekado: i remember that from my Debian days but in hindsight that seems totally crazy to me :-)
<civodul>those days when "apt-get install hello" would kindly inform you that glibc would be upgraded and xfree86 (!) would be removed
<rekado>yeah, I have the same feeling. It’s hard for me to change my perspective to see this as a good idea.
<rekado>I can see how it may seem *necessary*
<civodul>i suppose it's an optimization that made sense when bandwidth was low
<civodul>because it could potentially allow you to download just the bare minimum
<rekado>but especially in the context of reproducibility it seems counterproductive, at least.
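[Editor's illustration of the point discussed above: dependency resolution as constraint satisfaction typically admits several valid solutions, so which one you get is up to the solver. A minimal sketch, with made-up package names, versions, and one made-up compatibility constraint; a real solver (SAT/CDCL, as in Conda or Spack) works on far larger inputs but the non-uniqueness is the same.]

```python
# Sketch: pick one version per package so that all constraints hold.
# All names, versions, and the constraint below are hypothetical.
from itertools import product

# candidate versions available for each package
versions = {
    "numpy": ["1.19", "1.20", "1.21"],
    "scipy": ["1.5", "1.6"],
}

def satisfies(pick):
    """Assumed constraint: scipy 1.6 needs numpy newer than 1.19."""
    if pick["scipy"] == "1.6" and pick["numpy"] == "1.19":
        return False
    return True

# brute-force enumeration of every satisfying assignment
solutions = [
    dict(zip(versions, combo))
    for combo in product(*versions.values())
    if satisfies(dict(zip(versions, combo)))
]

print(len(solutions))  # → 5: many valid solutions, no unique answer
```

Even this toy instance has five satisfying assignments; without extra tie-breaking rules, two runs of a solver may legitimately return different environments.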
<civodul>yes; yet it's presented as one of the big features of conda/mamba/spack
<rekado>with spack I can see how it makes sense
<rekado>but conda is all about installing binaries quickly
<zimoun>I have tried to explain what the problem is
<rekado>I’ve read it before and I think it’s great!
<zimoun>ah cool! I am losing my mind if I already sent it. :-)
<rekado>(I think it may need some editing to have it “flow” a little better, but editing is rarely easy)
<rekado>rubygems had “lock” files (just like yarn and other JS package managers) to essentially record the installed package versions.
<rekado>like our generated “manifest” files in profiles
<rekado>and you could restore the dev environment by letting the package manager just read that lock file
<rekado>does Conda use something similar?
<zimoun>rekado: yeah, thanks. I will put it somewhere for collective editing. :-)
<rekado>in the case of Galaxy, for example, it seems insane to me to let Conda have any say in what packages exactly to install when the whole point is to install a well-known set of packages.
<zimoun>civodul: do you know if a Scheme interpreter is implemented in Coq?
<rekado>when binaries are “forever” archived and uniquely identified, then a lock file like this solves a *lot* of problems already
<zimoun>yeah, Julia Pkg uses something like that; with some UUID.
<rekado>it doesn’t help with source->binary transparency, which is a non-negotiable requirement for reproducibility, but it would be better than getting a mystery environment depending on how the solver feels today.
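[Editor's illustration of the lock-file idea described above: run the resolver once, record the exact versions it picked, and replay that record on later installs instead of resolving again. A minimal sketch with an invented JSON layout and made-up package versions; real formats (Gemfile.lock, yarn.lock, Julia's Manifest.toml) record more, e.g. hashes and the full dependency graph.]

```python
# Sketch of a "lock file": freeze resolved versions, then restore them
# without consulting the solver. Layout and versions are hypothetical.
import json

resolved = {"numpy": "1.21", "scipy": "1.6"}  # output of some one-time resolution

# write the lock once, right after resolution
lock = json.dumps({"packages": resolved}, indent=2, sort_keys=True)

# restoring the environment just replays the recorded versions
restored = json.loads(lock)["packages"]
assert restored == resolved  # same versions, solver never runs again
print(restored)
```

As rekado notes, this pins *which* versions you get but says nothing about how the binaries relate to their sources; it removes the solver's freedom, not the provenance problem.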
<civodul>i have trouble with "lock files"; the very name makes it sound obscure to me
<rekado>then they build the environment, which generates the Gemfile.lock
<rekado>and that’s roughly the equivalent of a generated profile manifest
<civodul>it's interesting because that lock file doesn't lock much: there are still many degrees of freedom
<rekado>I may be missing something, but i think the Gemfile.lock is complete
<rekado>yes, you’ve got constraints in the more detailed levels, but AFAICT everything has been hoisted up a level as a concrete fixed version
<rekado>why bother with that deeper level? I don’t know.
<rekado>that’s not news, though, is it? I first heard of Nix through Haskell, actually.
<rekado>it has a very strong connection to the Haskell community
<rekado>(that’s also what I found unpleasant, because many Haskell users are … difficult.)
<zimoun>yeah, nothing new. But the numbers are increasing.
<drakonis>it very much does not come as a surprise that nix is highly haskell-adjacent
<drakonis>since it appeals to haskell's brand of programming
<drakonis>the other thing that's important in the equation is that haskell is incredibly well supported by nix
<drakonis>they make it incredibly trivial to access the entirety of hackage