IRC channel logs
2024-12-05.log
<oriansj>aggi: I think it is best if the tcc-linux-musl.iso bootstrap built upon the Fiwix kernel step, which seems the logical stopping point of live-bootstrap for your purposes, and improvements to fiwix would benefit all bootstraps.
<oriansj>aggi: it has about a 2TB/month bandwidth cap but can burst up to 100MB/s as it is hosted on Linode (if that is acceptable to you and your needs)
<oriansj>jackdk: I think the eras of live-bootstrap would probably best be broken at: 1) after mescc-tools-extra, 2) after fiwix, and 3) after a modern linux toolchain; and distros would probably best be served using 2/3
<oriansj>homo: It was designed to provide the minimal core of haskell if one wanted to use haskell to improve the bootstrap chain at the M2-Planet level (other, better haskells exist that could be bootstrapped by later compilers), but ultimately no one used it and it was just another blind path of mine.
<oriansj>GHC remains a major bootstrapping problem.
<homo>well, as a guix user, I cannot build hugs because gcc 4.9 fails to build, meanwhile nhc98 is written in haskell, but built from generated C files
<jackdk>as a guix user I get hash mismatches on the distfiles for gnutls, libgcrypt, libgpg-error, and openssl-1.1.1l won't pass tests because all the certs in the test suite have expired
<homo>I don't mind programming in a mini-version of haskell that is easy to bootstrap, all I want are lazy evaluation and strong typechecking, it's just that this mini-version, sadly, won't be able to build existing projects like darcs
<mid-kid>oriansj: Nice! I'll try to collect a list of them next time around.
<pabs3>ACTION wonders if there has been discussion about bootstrapping VCSen before, for eg downloading the VCS source for pijul requires pijul https://pijul.org/downloads
<homo>how do distros package pijul then?
<oriansj>homo: same way they package a bunch of *shit*; they just take a binary blob and distribute that to their users
<homo>in the case of freebsd I see pijul-1.0.0-beta.9.crate
<oriansj>we just have the possibility, if the bootstrap chains are preserved, to figure out a route to complete their chains later and thus solve it.
<homo>I guess "*.crate" is some sort of archive format, I'm completely disconnected from rust
<fossy>only distributing your source code via the thing i am trying to build is absolutely stupid
<fossy>does anyone actually package this thing?
<fossy>^ this is true, that is how distributions do it, but i would really prefer if they provided an official .tar.gz of their source code
<homo>imagine if all code written in rust was available only through pijul
<homo>still, freebsd just downloads .crate for everything
<homo>also it's "awesome" that every language has its own package manager that mimics what system distros are already doing
<oriansj>homo: well there is a very long tradition of each new cycle recreating the mistakes of the last cycle. I just hope Guix and Nix packaging provide a long-term path out.
<fossy>eh, not sure it's fair to say that it mimics what system distros are doing, because language package managers often support different versions, whereas your typical distro has one version per package
<homo>in guix there are many packages with many versions
<homo>still, in distros the preferred practice is that all packages depend on the same version of a specific library; at the very least it makes the whole build faster, because only one version of the library is built
<homo>also I don't recall lts stackage mixing different versions of the same libraries
<mid-kid>Installing multiple versions of the same library side by side isn't a problem in most distros
<mid-kid>the problem is that the library needs to support this
<mid-kid>usually by having a different filename for each version
<mid-kid>if the whole language ecosystem supports this, it's not really a problem at all.
<homo>don't high-level languages just link everything statically?
<mid-kid>it's less a high-level language thing and more a "modern" thing
<mid-kid>where people prefer ease of distribution over disk space and compile times
<mid-kid>rust links everything statically, as does go
<mid-kid>this doesn't mean, however, that a system package manager can't manage static libraries, precompiled or not
<homo>well, can you really have dynamic linking in languages with static polymorphism?
<mid-kid>debian packages all rust crates it uses and just installs them as source code to be used when building rust packages
<mid-kid>the reason swift did it is that apple manages an entire system and values integration and easily being able to upgrade things that even proprietary apps use
<mid-kid>whereas rust is only concerned with delivering applications
<homo>ACTION can't find the word "dispatch" on that page
<homo>I assume you mean dynamic dispatch as opposed to static dispatch in polymorphism
<homo>even though go links everything statically, interfaces in go are dynamically dispatched
<fossy>is there a specific language you're considering that does cross-library static dispatch?
<homo>well, it can be any language with static polymorphism like haskell
<homo>the difference between static and dynamic is that in dynamic you can, for example, have a list of Eq: [32, "string", Point {x: 1, y: 8}]
<homo>this is how interfaces in go work
<homo>but in haskell static polymorphism will reject that
<homo>in a way it's an implementation detail: you can have a compiler for a statically polymorphic language that does static dispatch and another compiler for the same language that does dynamic dispatch; the first produces faster code, which makes it comfortable for end users, but the second just compiles faster, which makes it comfortable for developers
<homo>generics are static polymorphism
<homo>swift does dynamic dispatch
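A minimal Haskell sketch of the distinction homo is describing, for reference. The plain heterogeneous list is rejected by the type checker, but wrapping each value in an existential type recovers dynamic dispatch over a class dictionary, roughly what a Go interface value does. Show is used here instead of Eq so the hidden values can still do something observable, and Point, Showable and MkShowable are illustrative names, not from any project discussed above.

    {-# LANGUAGE ExistentialQuantification #-}

    data Point = Point { x :: Int, y :: Int } deriving Show

    -- Rejected: a Haskell list must be homogeneous, so there is no type for
    --   bad = [32, "string", Point { x = 1, y = 8 }]

    -- An existential wrapper erases the concrete type and keeps only the
    -- class dictionary, i.e. dispatch happens at run time.
    data Showable = forall a. Show a => MkShowable a

    instance Show Showable where
      show (MkShowable a) = show a

    ok :: [Showable]
    ok = [MkShowable (32 :: Int), MkShowable "string", MkShowable (Point 1 8)]

    main :: IO ()
    main = mapM_ print ok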
<matrix_bridge><googulator> homo: old GHC is easier to target because it's less source code - but even the earliest surviving version of GHC's source code is something like 20MiB already
<matrix_bridge><googulator> Modern GHC's code base is so complicated that it's kind of a miracle that it can even build itself
<homo>kinda like web browsers and the linux kernel
<matrix_bridge><googulator> but you don't need to build all of the source code to get a working kernel capable of hosting its own build
<matrix_bridge><googulator> most of it is driver code that's never gonna be used on your system, whatever system you might have
<matrix_bridge><googulator> as opposed to GHC, where most of the code is for language features that are used in the very code that implements them
<matrix_bridge><googulator> contrast that with even GCC - it supports C++20 and many other languages, but only needs a C++11 compiler to build
<homo>you mean it's difficult to even rebuild libraries that ghc depends on?
<matrix_bridge><googulator> Libraries are another can of worms; recent GHCs actually use built-in dependency resolution for their own code
<matrix_bridge><googulator> But I'm just referring to the included Haskell source code that implements the language
<matrix_bridge><googulator> It's not written in a restricted or even well-defined subset of the language, but instead uses almost every feature
<homo>also both C++11 and C++20 are standards and have competing implementations, but that's not the case with haskell
<matrix_bridge><googulator> & the modern GHC-Haskell dialect is so complex and often poorly documented that there's little chance of anyone writing a compatible reimplementation in a different language, or even in a more restricted Haskell
<homo>isn't it a nightmare to maintain then?
<matrix_bridge><googulator> in fact, I would hazard a guess that for most of GHC's code as it stands today, not even its own authors could independently rewrite it, even in full GHC-Haskell, while retaining the ability to compile GHC
<matrix_bridge><googulator> Another way to put it - it took ~30 years for GHC to be written; maybe it could be independently reimplemented in another 5-10 years of effort, but by then GHC itself will have moved on to be even more complex
<homo>and I thought rust was bad...
<matrix_bridge><googulator> With Rust, you have a specification to help you
<matrix_bridge><googulator> & strict assurances to make sure rustc's own code doesn't depend on undocumented rustc-specific extensions
<matrix_bridge><googulator> By contrast, the oldest GHC we have (v0.26) is already written in an extended version of Haskell 1.2 that was only ever implemented in GHC
<matrix_bridge><googulator> v0.24 was the last version written in a standardized version of Haskell, and the last one to ever be successfully built by a compiler other than a prior GHC - and it's completely lost
<homo>are these features used outside of ghc?
<matrix_bridge><googulator> Most notably, the module system would later become part of the standard in Haskell 98 (IIRC)
<matrix_bridge><googulator> My hope is that 0.26 is still close enough to 0.24 that it's at least feasible to reverse engineer and reimplement the needed features in another 1.2-compliant compiler that's not written in Haskell - which is where Yale Haskell comes into play, as it's meant to be 1.2- and 1.3-compliant
<matrix_bridge><googulator> the last version of Yale Haskell we have is from mid-1994, while GHC 0.26 was late 1995
<homo>a lot of old code bases are actually hard to build
<homo>so hard it might be easier to just create a new compiler
<matrix_bridge><googulator> We know that 5.04 is new enough that Guix can build it using 4.08 (IIRC)
<homo>another thing I thought was bad about C++ and rust is bloat in the language
<homo>guix cannot build gcc 4.9 and therefore cannot build hugs, because it depends on gcc 4.9
<matrix_bridge><googulator> & from then on, it's possible to bootstrap up to 7.4 (up to 7.0 is implemented currently in Guix, but it's easy to extend to 7.4)
<matrix_bridge><googulator> then in 7.6, hell breaks loose, so the next buildable version is 7.10, using a downloaded binary of 7.8
<matrix_bridge><googulator> Hugs is unfortunately useless in and of itself, as it has no support for recursive module dependencies, which GHC's source code has always consistently had since 0.26
<homo>hugs is useful for personal projects
<matrix_bridge><googulator> and AFAIK it's not simply unimplemented, but rather a fundamental limitation in Hugs' design
<homo>I mean when I program in haskell, I want lazy evaluation, the IO monad, static type checking and syntax that is not painful to write
<matrix_bridge><googulator> Writing a "simple" Haskell like blynn or microhs is the "easy" part (not that it's actually "easy" or "simple", just in comparison) - the hard part is extending it to where it can compile GHC
<matrix_bridge><googulator> Basically if a compiler can understand GHC, it can probably understand every Haskell program that exists
<homo>I looked at clean, it looks like haskell without the IO monad, but I don't know if it's bootstrapped
<homo>I'm not aware of any other functional language that has lazy evaluation
<matrix_bridge><googulator> there are only 2 LazyML compilers known, one's written in itself, the other in _cough_ Haskell
<matrix_bridge><googulator> Miranda and Clean were both originally proprietary, the free versions were built by proprietary ones that came before them
<homo>it must be really hard to implement declarative languages using imperative languages
<homo>like in haskell it's not difficult to create languages using parsing combinators
<homo>but I never tried creating even a math expression evaluator in imperative languages
<jackdk>It's really not as bad as you might think. Crafting Interpreters implements some pretty good evaluators in Java and C, which gets you good portable idioms for lexing and parsing
<jackdk>I have actually been working on a runtime for lazy purely functional languages in -Wall -Wextra -pedantic clean C99, but I discovered some mistakes that forced me to restart and I haven't had a chance to get back to it for a few months
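For reference, a toy version of what homo means by parsing combinators: an arithmetic expression evaluator for integers, +, * and parentheses, built from hand-rolled combinators with no parser library. Parser, orElse, expr, term and factor are made-up names for this sketch, and error reporting, subtraction, division and the usual Alternative/Monad instances are omitted for brevity.

    import Data.Char (isDigit, isSpace)

    newtype Parser a = Parser { runParser :: String -> Maybe (a, String) }

    instance Functor Parser where
      fmap f (Parser p) = Parser $ \s -> fmap (\(a, rest) -> (f a, rest)) (p s)

    instance Applicative Parser where
      pure a = Parser $ \s -> Just (a, s)
      Parser pf <*> Parser pa = Parser $ \s -> do
        (f, s')  <- pf s
        (a, s'') <- pa s'
        pure (f a, s'')

    -- Try the first parser; if it fails, retry the second on the same input.
    orElse :: Parser a -> Parser a -> Parser a
    orElse (Parser p) (Parser q) = Parser $ \s -> maybe (q s) Just (p s)

    char :: Char -> Parser Char
    char c = Parser $ \s -> case dropWhile isSpace s of
      (x:rest) | x == c -> Just (c, rest)
      _                 -> Nothing

    number :: Parser Int
    number = Parser $ \s -> case span isDigit (dropWhile isSpace s) of
      ("", _)     -> Nothing
      (digits, r) -> Just (read digits, r)

    -- expr ::= term ('+' expr)?   term ::= factor ('*' term)?   factor ::= number | '(' expr ')'
    expr, term, factor :: Parser Int
    expr   = ((+) <$> term <*> (char '+' *> expr)) `orElse` term
    term   = ((*) <$> factor <*> (char '*' *> term)) `orElse` factor
    factor = number `orElse` (char '(' *> expr <* char ')')

    main :: IO ()
    main = print (runParser expr "2 * (3 + 4)")   -- Just (14,"")

The same shape can be written in an imperative language as a recursive-descent parser, which is essentially what the Crafting Interpreters material jackdk mentions walks through in Java and C.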
<sam_>[04:15] <matrix_bridge> <googulator> With Rust, you have a specification to help you
<sam_>[04:16] <matrix_bridge> <googulator> & strict assurances to make sure rustc's own code doesn't depend on undocumented rustc-specific extensions
<aggi>oriansj: i am not at all concerned about fiwix within live-bootstrap and i will not touch any known-good working state
<aggi>currently i am testing musl-1.1 and musl-1.2, and thinking about how to shrink the devutils.squashfs package-set down to something reasonable
<aggi>so far it seems both musl-1.1 and musl-1.2 are functional against a linux-2.4 ABI (patched for nptl support)
<aggi>time64 stat/stat64 handling with musl-1.2 still requires careful review/testing, but it's good to know musl-1.2 is prepared for year2038
<aggi>i have an idea that simplifies publication: i'll publish build-scripts only
<aggi>it's just i was told a complete distribution driven by tcc would not be feasible nor desirable
<aggi>i could upload the gentoo overlay and a bunch of shell scripts, but that's total chaos
<aggi>and i wouldn't want to produce any more noise or confusion
<aggi>nonetheless, it is feasible, and all that's remaining is another few iterations of cleanup and scripting
<homo>still, building modern ghc by chain-building all ghc versions starting from 0.26 is going to make it extremely hard to adjust the source code to build on different OSes and CPU architectures; it sounds like it's simply not worth trying to build something that old
<homo>adjusting source code is even more tedious when compilation speed is slow; taking my x86_64 laptop into consideration, I expect it would take at least 2 months to compile all these versions of ghc, and at that point patching all ghc versions just to make it bootstrappable on different OSes and arches would take years
<homo>like who would even try to do that just for riscv alone?
<matrix_bridge><googulator> For riscv, the only option would be cross-compiling
<matrix_bridge><googulator> But cross-compiling from a bootstrapped x86 version is much better than cross-compiling using an x86 binary of unknown and unverifiable provenance
<homo>but that means it's not bootstrappable on riscv, as it's impossible to build ghc by having only riscv hardware and ghc's source code
<matrix_bridge><googulator> "Not bootstrappable on riscv" is a lesser issue than "not bootstrappable, period"
<matrix_bridge><googulator> In fact, even restoring the ability to go from .hc files (of any version) to a modern GHC would be a step forward
<matrix_bridge><googulator> If the gap from 7.4 to 7.6 can be bridged, we can have the Haskell world be rooted in 4.08's .hc files, rather than 7.6's binary distribution
<homo>but aren't .hc files generated from .hs files, which basically makes them the same as binaries?
<matrix_bridge><googulator> I'd consider them a step worse than mere pregenerated files, but not quite as evil as a binary of unknown provenance
<homo>sounds like it's easier to have an alternative compiler and work with the developers of existing projects like darcs to make them buildable without ghc
<matrix_bridge><googulator> IMO modern Haskell is _way_ too complex to plausibly write an alternative compiler
<homo>I mean adjusting the source code of existing projects to build under haskell98 or haskell2010
<matrix_bridge><googulator> Those had no integrated dependency management, which the whole modern Haskell world fundamentally depends on
<homo>you mean dependency management outside the scope of cabal?
<matrix_bridge><googulator> It's like trying to remove Maven/Gradle from the Java world, or Cargo from the Rust world
<matrix_bridge><googulator> AFAIK Cabal didn't yet exist in Haskell98, and was far more primitive in 2010
<homo>let me guess, it's a pain to build under a different compiler
<homo>but aren't both cabal and cargo just tools like make? make is not part of the C standard, but is used to build a lot of C projects
<matrix_bridge><googulator> Kind of, but the problem is that these tools aren't stable, unlike Make
<matrix_bridge><googulator> you can build the latest GNU Make using a fairly old GCC, LLVM, or any other competent C compiler
<matrix_bridge><googulator> and you can use a not-so-recent Make to build modern C compilers
<matrix_bridge><googulator> by contrast, you need the latest cargo to build the latest rustc, and vice versa
<homo>the format itself doesn't look hard to parse with a different tool
<homo>and there it also states "default-language: Haskell2010"
<homo>never mind, at the bottom it states which extensions it uses and that's too much
<lanodan>And while make isn't part of ISO C, it is part of POSIX and has several implementations (like GNU, the various BSDs maintaining their own spin, pdpmake, illumos dmake, …)
<homo>and those implementations are incompatible with each other as they come with their own extensions
<homo>even the syntax for "if" is different
<lanodan>Well that's also true for C and shell, most languages end up with extensions, but you can choose not to use said extensions
<homo>anyway, I meant the separation between a standard for the language, a standard for libraries and a standard for tools
<homo>haskell2010, from what I see, is a standard for the language and libraries; it doesn't cover tools
<homo>and it doesn't look hard to create a parser for .cabal files to build projects and leverage apt, guix, whatever for complicated dependency management
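As a rough illustration of homo's point that the surface syntax of .cabal files is not hard to parse, a toy Haskell reader for top-level "field: value" pairs. It deliberately ignores sections (library, executable, ...), conditionals and line continuations, so it is nowhere near a replacement for Cabal's real parser or its dependency resolution; topLevelFields and the sample file are made up for this sketch.

    import Data.Char (isSpace, toLower)
    import Data.List (dropWhileEnd)

    trim :: String -> String
    trim = dropWhileEnd isSpace . dropWhile isSpace

    -- Collect unindented "name: value" lines; indented lines belong to sections.
    topLevelFields :: String -> [(String, String)]
    topLevelFields src =
      [ (map toLower (trim name), trim (drop 1 value))
      | line <- lines src
      , not (null line)
      , not (isSpace (head line))
      , take 2 (trim line) /= "--"          -- skip comment lines
      , let (name, value) = break (== ':') line
      , not (null value)
      ]

    example :: String
    example = unlines
      [ "cabal-version: 2.4"
      , "name: toy-package"
      , "default-language: Haskell2010"
      , "library"
      , "  build-depends: base"
      ]

    main :: IO ()
    main = mapM_ print (topLevelFields example)
    -- prints ("cabal-version","2.4"), ("name","toy-package"), ("default-language","Haskell2010")

The hard part, as googulator notes, is everything behind those fields: version ranges in build-depends, flags and conditionals, and solving them against a package set, which is where apt, guix or the like would have to take over.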
<matrix_bridge><Andrius Štikonas> well, if haskell is so complex, maybe a line-by-line port of modern haskell to another language could work, but that's years of porting work...
<stikonas>but it's hard to see any other way to jump from an old GHC to a modern one (even if we could get the old one, which we can't)
<homo>the thing is, even for personal programming I choose haskell simply because there is no other language that provides good-looking syntax, lazy evaluation, strong type checking and the IO monad, and if there is one, it has the same bootstrap problem
<homo>it's painful to write code in haxe, lisp, erlang and others
<rekado>to those who think it would be easier to write an alternative implementation of Haskell 2010: please look at the existing alternative implementations of Haskell98. If that's not a good enough foundation to write a 2010 implementation then you'll have a hard time closing the gap. And if it is good enough you'd have saved all the time required to support Haskell98.
<rekado>the one big advantage of the bootstrap starting from ghc 4 is that it is known to work, because something like it has been done over the decades.
<rekado>before ghc4 things become very difficult, because of various non-free compilers that were around at that time.
<rekado>note also that GHC has always implemented non-standard Haskell and depends on a compiler supporting these non-standard features.
<homo>bootstrapping from ghc4 is still problematic for various os+arch combinations
<homo>suppose netbsd on riscv32, how do you bootstrap modern ghc from ghc4?
<rekado>I won't, but this is something that could be worked around (have you looked at the GHC4 code?). Implementing a cross-platform replacement for Haskell 2010 is just not terribly realistic, in my opinion.
<homo>isn't hugs a somewhat cross-platform haskell98?
<rekado>I have attempted to use hugs to interpret GHC before, but even with heavy modifications to the GHC sources this was not possible. That's in no small part due to all the real-world abstraction-breaking stuff in the GHC RTS.
<rekado>in the old days Hugs was able to be used as a compiler, but it required a different RTS.
<homo>isn't hugs just an interpreter?
<homo>the haskell report states that the difference between haskell98 and haskell2010 is not big: it added FFI, hierarchical module names and pattern guards, but removed (n + k) patterns
<rekado>GHC is not implemented using standard Haskell.
<rekado>yes, hugs is an interpreter, but it was able to interpret the code for a Haskell compiler
<rekado>which could thus be used to compile the interpreted compiler
<homo>you mean nhc98 on top of hugs?
<rekado>it was just one example that I looked at
<rekado>I read through a lot of old release notes, and the old GHC release notes would sometimes mention compatibility with Hugs, and very old versions would come with a fork of Hugs.
<rekado>(there was a part 2 to the blog post, but I never bothered publishing it, because it kind of fizzled out.)
<homo>in short it's a chain of compilers compiling each other into vm code
<rekado>it has been mentioned here a few years ago
<rekado>I stopped following all this a few years ago, because my amount of interest in Haskell began to match the lack of interest of the Haskell community in bootstrapping.
<homo>well, I'd like to switch to a smaller functional language that is bootstrappable, but I don't know of any that has the 4 basic things at the same time
<homo>anyway, it's surprising all implementations are abandoned
<homo>a small vm seems to be a promising approach as it reduces the amount of code necessary to support different oses and arches, at least until ffi is hit
<fossy>os+arch combinations are a non-issue for me when considering higher-level bootstraps, as QEMU and cross-compilation exist ......
<pabs3>homo: there is also a pijul crate on cargo, which is hopefully the same as the vcs data :)
<fossy>blynn's compiler still seems to be very far from anywhere near haskell compilation, unfortunately
<pabs3>mid-kid: crates are often different to the VCS, for eg the rand crate contains Rust code generated by a Python script, and that script is not in the crate but only in the VCS
<homo>pabs3: obviously, as freebsd downloads pijul.crate
<homo>the advantage of blynn's compiler is writing haskell in a tiny subset of haskell
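As a side note on homo's summary above of the Haskell 98 to Haskell 2010 delta, a small sketch of two of the named features: pattern guards, which Haskell 2010 standardized, and (n+k) patterns, which it removed. The function names classify and fact are made up for illustration.

    -- Pattern guards (Haskell 2010): a guard can bind with <- as well as test a Boolean.
    classify :: [(String, Int)] -> String -> String
    classify table name
      | Just n <- lookup name table, n > 0 = name ++ " is positive"
      | Just _ <- lookup name table        = name ++ " is non-positive"
      | otherwise                          = name ++ " is unknown"

    -- (n+k) patterns were Haskell 98 syntax:  fact (n+1) = (n+1) * fact n
    -- Haskell 2010 drops them; the same function uses an ordinary guard instead:
    fact :: Integer -> Integer
    fact 0 = 1
    fact n | n > 0     = n * fact (n - 1)
           | otherwise = error "fact: negative argument"

    main :: IO ()
    main = do
      putStrLn (classify [("x", 3), ("y", -1)] "x")   -- x is positive
      print (fact 5)                                  -- 120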