IRC channel logs
2026-03-20.log
<stikonas>or an even more interesting case (perhaps for the future): can LLMs do reverse engineering of binary blobs, e.g. for early bootloaders, firmware, drivers, etc.? If they can write a spec, then somebody reads the spec and manually reimplements it
<ekaitz>stikonas: we should start doing that so they forbid the AI
<ekaitz>when you use it to "steal" things from the corpos they'll regulate fast
<stikonas>well, depends how rich they are I guess
<stikonas>it used to be that Hollywood and other publishers could regulate copyright law
<stikonas>now AI companies have basically weakened copyright
<matrix_bridge><Lance Vick> fossy: I am not even sure the "co-authored by" even makes sense, as that is just going to invite flame wars and feels akin to "co-authored by autocomplete in my vim editor". The bottom line is everyone needs to read and take full responsibility for their contributions regardless of how they authored them.
<ekaitz>I agree with that; in Guix we had a similar discussion and I basically said AI doesn't bring new problems, but augments the ones we already had
<ekaitz>if someone copied the code from another codebase, how should we act?
<ekaitz>AI is basically that, but with more steps
<ekaitz>the copyright issue we already had before
<ekaitz>the main problem is we assumed good faith in the contributor
<ekaitz>but now that's more tricky, because the contributor may not know if the code is stolen from somewhere else
<ekaitz>which again is the contributor's responsibility
<matrix_bridge><Lance Vick> People copy and tweak functions from stackoverflow all the time, often assisted with fancy "pre-AI" autocomplete tools that look up functions for common needs on the fly to avoid including third-party libs for one function. Modern models also generally do not produce more than a line or two identical to source material, so I would say the stack overflow case is actually worse now, but that has always...
<matrix_bridge><Lance Vick> So yeah, 100% it has to be each contributor's responsibility
<matrix_bridge><Lance Vick> If a reviewer merges AI slop or broken code, it is also partly on them too.
<matrix_bridge><Lance Vick> But if the code is lean, does the job, no one can tell if it is AI or not, everyone can understand it, and to the best of anyone's knowledge it is not copied from any other projects, then ship it.
<matrix_bridge><Lance Vick> For once I find myself siding with torvalds on something.
<deesix>The amorality of the reasoning is really irking. So if nobody can tell if I stole, force, kill, etc... it's OK. Right...
<stikonas>it's a bit different though. It was always possible for other people to study your source code, learn from it, and perhaps memorise some small chunks that would get reproduced
<stikonas>I'm not saying that's all good now. Probably AI will make lots of things worse too
<stikonas>it used to be possible to just have your free system and write code for free
<stikonas>and it might be that in the future that will be basically impossible (very slow and inefficient compared to somebody else with a subscription)
<stikonas>or maybe there will be some libre AI models...
<stikonas>though it's hard to see how they could compete with commercial models
<ekaitz>deesix: as I said on the Guix mailing list, I do have opinions on drug use, but some of my friends use drugs and code while under their effects
<ekaitz>deesix: they say they write better code sometimes, and other times they don't understand later what they wrote
<ekaitz>I do have a stance on drugs, but should I go to those that use them and tell them whether I like it or not?
<ekaitz>because, let me remind you, drug trafficking does kill people
<deesix>Am I telling anyone to do this or that? Read just what I wrote.
<ekaitz>deesix: I understood what you wrote. You were complaining to us because we didn't bring morality into the discussion
<ekaitz>now the thing is, why in this case?
<ekaitz>"if nobody can tell if I stole, force, kill, etc... it's OK. Right..." -> this is basically what happens with everything in life <deesix>I'm not complaining to "you". I'm talking about the reasoning. <ekaitz>what I'm trying to say here is this is the only reasoning one could have <ekaitz>each person is responsible of their actions and free to commit them <ekaitz>and I'm saying we are compelled to add the "moral" aspect to things only when it is interesting to us <ekaitz>and this case with AI is triggering, but others might be too <ekaitz>I do have a moral stance on AI, and probably Lance has one too, but at least I decided to leave that away for a minute <ekaitz>now the question is, if someone without using AI steals code and contributes it, is that a different case than the AI case? <ekaitz>-> if we don't find out, it's ok! <ekaitz>we just said, that case already existed before <ekaitz>(replace steal code by waste energy, water, push goverment agenda... the whole thing)