IRC channel logs

2022-12-05.log

<rekado>the process to get there may be fairly well known, but hardware that’s available to individuals is not capable of producing the same results on comparable time scales.
<rekado>this goes way beyond the individual’s freedom to access source code
<rekado>I think that a better lens to view this problem is one that focuses on commons — both in the legal sense for what ends up being training data and for communal compute resources
<rekado>the only popular model with ‘source available’ license that I know of is Stable Diffusion: https://github.com/CompVis/stable-diffusion
<rekado>it comes with a number of ethical usage restrictions, making it non-free.
<zimoun>hi!
<zimoun>rekado: IMHO, there is no “user freedom” issue with pre-trained AI models. The weights are just data. The way you obtain these data does not matter; it’s similar to astronomical data, genomic data, etc.
<zimoun>the whole question is about trust.
<PurpleSym>Where can I send my Guix-HPC blog post on guix-cran for formal review? guix-science@?
<zimoun>PurpleSym: do you have commit access for the Guix-HPC website?
<PurpleSym>zimoun: I don’t think so.
<zimoun>PurpleSym: so maybe you can send the draft to guix-science, and I can put it under drafts/ for you and publish once ready. WDYT?
<zimoun>civodul: do not miss patch#59836 (for the release and for the tutorial next week :-))
<civodul>zimoun: awesome, thanks!
<zimoun>civodul: thanks, that was fast :-)
<PurpleSym>zimoun: Sure, I’ll prepare a patch and send it to guix-science@ tomorrow.
<rekado>I just learned of GPT-J: https://github.com/kingoflolz/mesh-transformer-jax/#gpt-j-6b
<civodul>uh