IRC channel logs



<drakonis>civodul: y'know, i can never get that to work
<drakonis>links have TLS errors here
<wehlutyk[m]>Hello Guix-hpc :)
<wehlutyk[m]>In free time I'd like to start looking at what's involved in getting guix-daemon running without root privileges
<wehlutyk[m]>But I would very much appreciate guidance on this, whichever the level possible
<wehlutyk[m]>Can someone tell me where I should start looking, i.e. what parts of the code are the most important for this?
***sebbu3 is now known as sebbu
<drakonis>wehlutyk[m]: hmm, look at the directory "nix" on the root of the repository
<wehlutyk[m]>drakonis: thanks
<drakonis>a guile rewrite would be nice
<wehlutyk[m]>I see it basically defers to settings.guixProgram, which is `canonPath(getEnv("GUIX", nixBinDir + "/guix"));`, but can't find where that executable then comes from
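(Editor's note: the quoted C++ expression means the daemon resolves its `guix` executable from the `GUIX` environment variable, falling back to `nixBinDir + "/guix"`, and canonicalizes the result. A rough Python sketch of that lookup, where `NIX_BIN_DIR` is a hypothetical stand-in for the configured `nixBinDir` value:)

```python
import os
import os.path

NIX_BIN_DIR = "/usr/local/bin"  # hypothetical placeholder for nixBinDir

def guix_program():
    # Mirrors canonPath(getEnv("GUIX", nixBinDir + "/guix")):
    # the GUIX environment variable overrides the default location,
    # and the resulting path is canonicalized.
    return os.path.realpath(os.environ.get("GUIX", NIX_BIN_DIR + "/guix"))
```

So exporting `GUIX=/path/to/guix` before starting the daemon redirects which `guix` binary it invokes.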
<rekado_>I’m in the process of replacing the CentOS instance hosting with a Guix System VM that runs the hpcguix-web service.
<rekado_>it seems to work superficially, but it never finishes generating the packages.json file
<rekado_>I see a ‘git’ directory and a packages.json.lock file in the cache directory, but they haven’t changed in hours.
<rekado_>the log is also empty
<rekado_>oh, maybe I just ran out of space…
<rekado_>yeah, that was it.
<rekado_>zimoun: I think we should start moving to bioconductor git soon. I’d like us to benefit from the SWH archive as soon as possible.
<civodul`>fun fact: a colleague was running "guix pull" and it would take ages (> 10m) whereas for me, on the same cluster, it completes in 2m
<civodul`>turns out they were upgrading from a nov. 2019 revision
<civodul`>so i guess it had to redownload/rebuild dependencies of compute-guix-derivation.drv
<rekado_>could we avoid this computation in the best case by having a database where we can look up these guix derivations?
<civodul`>but it causes trust issues
<civodul`>it would be ideal if we could use substitutes for that
<zimoun>rekado_: yeah, moving Bioconductor to git-fetch could be nice. But now, Disarchive starts to work… Well, for sure, it would simplify the content-addressing mechanism
<zimoun>civodul`: ah you see about derivations. :-) And it is worse on laptop. And worse than worse on old hardware.
<zimoun>we have to set a policy as explained in the recent thread. :-)
<civodul`>zimoun: what i described isn't related to hardware speed, though
<civodul`>but yeah, terrible
<zimoun>I agree. You are saying it is terrible on “recent” cluster. I say, it is worse on laptop. And worse than worse on old hardware. :-)
<zimoun>rekado_: about the Bioconductor switch, here is a recent report about what is in SWH: <> In case you missed it on guix-devel :-)
<civodul`>oooh, nice!
<zimoun>Details here <> :-)
<civodul`>good reason to open my mailbox :-)
<zimoun>civodul`: if you have time to give a look at <>. Because if this patch does not make sense, I will turn it into a script for etc/xyz.scm. :-)
<rekado_>zimoun: I read the report. That’s why this issue came back to the front of my mind :)
<rekado_>found in the wild: “The Eigen library is upgraded to version 3.3.7. We have observed that this results in the principal components produced by the pca mode to be ever so slightly different from the previous versions. […] Thus for ongoing analyses it is recommended that you do not upgrade your QTLtools version.”
<civodul>fun :-)
<rekado_>this is something that many users are blissfully unaware of.
<rekado_>hence the exclusive obsession with the version string of their favourite tool.
<rekado_>all other ingredients get a free pass.
<civodul>version strings alone would deserve a talk or blog post or something
<zimoun>civodul: I am writing one, stay tuned. (But I procrastinate doing other irrelevant stuff, as usual :-))
<zimoun>civodul: oh, maybe SWH is changing the way they compute SWHIDs, which may mean Git compatibility is not guaranteed. Disarchive-DB would thus become essential.
<rekado_>my colleague needs to install Galaxy on another server, and it’s pretty straightforward: you copy the code, run the installer, and it goes ahead and uses conda to install all dependencies for all tools/plugins
<rekado_>turns out that this takes 3+ hours.
<rekado_>for 175MB of package data
<rekado_>the problem is not the download, but solving the environment constraints
<rekado_>I totally forgot about this
<rekado_>living in this world where the dependency graph is fixed is pretty neat
<rekado_>but for conda the problem gets really bad when you want to set up a large environment
<rekado_>this is with packages from conda-forge and bioconda
<rekado_>also interesting: they now go pretty far down, close to the root of the graph
<rekado_>the dependencies include fake packages like sysroot_linux-64, gcc_impl_linux-64, etc
<rekado_>so: fewer assumptions about the target system
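(Editor's note: the constraint-solving point above is the crux. Choosing one version per package so that every dependency range is mutually satisfiable is a search problem, NP-hard in general, which is why conda uses a SAT solver and why large environments can take hours to solve even when the download itself is small. A toy backtracking resolver over a hypothetical package index, just to illustrate the shape of the problem — not conda's actual algorithm:)

```python
# Hypothetical package index: package -> {version: {dep: allowed versions}}.
REPO = {
    "galaxy": {"1.0": {"numpy": {"1.0", "2.0"}, "scipy": {"1.0"}}},
    "numpy":  {"1.0": {}, "2.0": {}},
    "scipy":  {"1.0": {"numpy": {"1.0"}}},  # scipy pins numpy to 1.0
}

def solve(targets, chosen=None):
    """Backtracking search for a consistent version assignment.

    `targets` is a list of (package, allowed-versions) requirements;
    returns a {package: version} dict, or None if unsatisfiable.
    """
    chosen = dict(chosen or {})
    if not targets:
        return chosen
    pkg, allowed = targets[0]
    if pkg in chosen:
        # Already decided: the earlier choice must also satisfy this range.
        return solve(targets[1:], chosen) if chosen[pkg] in allowed else None
    # Try newest versions first; backtrack when a choice proves inconsistent.
    for version in sorted(allowed & REPO[pkg].keys(), reverse=True):
        deps = list(REPO[pkg][version].items())
        result = solve(deps + targets[1:], {**chosen, pkg: version})
        if result is not None:
            return result
    return None

env = solve([("galaxy", {"1.0"})])
# The solver first tries numpy 2.0, hits the scipy pin, and backtracks
# to numpy 1.0 — it is this backtracking that blows up on large graphs.
```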
<zimoun>rekado_: yeah I remember when the SAT solver had been introduced in Conda. And I remember the project <>.
<zimoun>civodul: as I know you like Julia and this author, do not miss <> ;-)
<civodul>zimoun: thanks for the link!
<civodul>i don't know Lee Phillips tho, do i?
<zimoun>civodul: I do not know if you know them. But you mentioned and pointed to me several of their articles. :-)