IRC channel logs

2022-09-30.log


<oriansj>stikonas: well __DATE__ and __TIME__ wouldn't have been a problem if they had been set to the modification time of the files (which would have been fixed values in tarballs and source control commits)
<vagrantc>when i do a fresh git checkout, the timestamps are usually recent timestamps, not the last modified time
<vagrantc>and some files get generated during a build ... which will likely have a current timestamp as well
<oriansj>vagrantc: even if you do stat?
<oriansj>feel free to leverage touch -t 200805101024 foo in your testing
<muurkha>generally source control systems update the mtime when you check out files
<muurkha>if they didn't, make might not rebuild them
<oriansj>muurkha: but if the files didn't change in the checkout why would they need to rebuild them?
<oriansj>I will grant you that they would need to set a proper create date but that is done by the kernel for audit purposes
<oriansj>and access date would have to be updated on every access (unless a flag inhibiting that is set)
<oriansj>but I can't imagine any reason to update the modification date to be anything different than what was checked in to version control.
<muurkha>oriansj: yeah, that flag is almost always set nowadays
<muurkha>but the reason to update the modification date to be anything different than what was checked in to version control is scenarios like the following:
<muurkha>t=1: modify foo.c; t=2: compile foo.c; t=3: check in foo.c; t=4: modify foo.h; t=5: coworker modifies foo.c; t=6: coworker checks in foo.c; t=7: compile foo.c (because of new foo.h); t=8: check out foo.c (to prepare for checking in foo.h); t=9: compile
<muurkha>before this last step, foo.o's mtime is 7, the modification time of foo.c in source control is 6, but foo.o does not yet incorporate the changes from this other foo.c
<muurkha>so if at t=8 the source control system sets foo.c's mtime to 6 instead of 8, you are in a world of hurt, maybe checking in a broken tree that doesn't even build
<muurkha>a simpler but less common scenario is the following:
<muurkha>t=1: check in modified foo.c; t=2: check in modified foo.c; t=3: compile foo.c, noticing a bug; t=4: check out the version of foo.c from time 1; t=5: compile
<muurkha>if at t=4 the source control system sets foo.c's mtime to 1 instead of 4, the compile at time 5 will be a no-op
<muurkha>~
<muurkha>at least in that case you would probably notice and do a make clean
<muurkha>but it would still be an annoyance
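muurkha's two scenarios boil down to make-style mtime comparison; a minimal Python sketch (hypothetical foo.c/foo.o names, a stand-in for make's staleness check) of how a backdated checkout defeats it:

```python
import os, time, tempfile

def needs_rebuild(target, source):
    """Staleness rule as make applies it: the target is rebuilt
    only if the source's mtime is newer than the target's."""
    if not os.path.exists(target):
        return True
    return os.stat(source).st_mtime > os.stat(target).st_mtime

d = tempfile.mkdtemp()
src = os.path.join(d, "foo.c")
obj = os.path.join(d, "foo.o")
open(src, "w").write("int x = 1;")
open(obj, "w").write("stale object")        # "compiled" earlier, now outdated

# Simulate a checkout that backdates foo.c to its commit time,
# i.e. earlier than the stale object file:
past = time.time() - 3600
os.utime(src, (past, past))

print(needs_rebuild(obj, src))               # False: the stale foo.o is kept
```

With the backdated mtime, the mtime-only rule concludes nothing changed, exactly the "world of hurt" described above.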
<muurkha>build systems that use SHA-2 sums or something instead of mtimes avoid this problem, but
<muurkha>does that make sense?
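The content-hash alternative mentioned above, sketched in Python (SHA-256 stands in for whatever digest a real build system would use; file names are hypothetical):

```python
import hashlib, os, tempfile

def fingerprint(path):
    """Content hash: rebuild decisions keyed on what the file
    contains, not on when it was last written."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

d = tempfile.mkdtemp()
src = os.path.join(d, "foo.c")
open(src, "w").write("int x = 1;")
seen = {src: fingerprint(src)}               # recorded at the last build

os.utime(src, (0, 0))                        # backdate: the mtime now lies
print(fingerprint(src) == seen[src])         # True: content unchanged, skip rebuild
open(src, "w").write("int x = 2;")
print(fingerprint(src) == seen[src])         # False: content changed, rebuild
```

Timestamp games (backdating checkouts, clock skew) have no effect on the decision, at the cost of reading and hashing every input.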
<oriansj>muurkha: but the create date being updated by the version control on checkout would actually logically map to what the change is. (checking out a new file) and build tools can simply use the largest of the two values to determine if a build was needed.
<oriansj>but I can grant you that odds are that most version control systems don't even bother to set the modification date at all and just are doing an fopen and the kernel will be default set access/modify/create all to the exact same value.
<muurkha>what do you mean by "the largest of the two values" and "would logically map to what the change is"? sorry, I'm stupid tonight
<muurkha>oh, the largest of mtime and ctime? hmm, could be
<muurkha>although that will be pretty unpleasant if the mtime is set several minutes or hours in the future because your coworker's machine is suffering clock skew
<muurkha>it's also the case that utime() and utimes() don't offer a way to change the ctime
<oriansj>muurkha: true but a couple hours in the future would be a definite sign of a serious configuration problem. (probably completely broken ntp setup)
<oriansj>muurkha: well removing the old file and creating a new one would certainly do that and then you just dump the calculated contents
<muurkha>similarly for utimensat() and futimens()
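The utime() limitation above can be demonstrated directly (a minimal Python sketch; the kernel bumps ctime to "now" on any metadata change, including the utime() call itself):

```python
import os, tempfile

fd, path = tempfile.mkstemp()
os.close(fd)

target = 210000000                           # some moment in 1976
os.utime(path, (target, target))             # sets atime and mtime only

st = os.stat(path)
print(st.st_mtime == target)                 # True: mtime is freely settable
print(st.st_ctime == target)                 # False: ctime was bumped by the call
```

So a tool that wanted oriansj's max(mtime, ctime) scheme couldn't forge the ctime even if it tried; only the kernel writes it.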
<oriansj>or creating a new tempfile with the new contents and then replacing the old file with the tempfile (which would be more atomic of a change and reduce the risk of a partial checkout)
<muurkha>yeah
<muurkha>I haven't looked to see how git checks files out but I would be surprised if that isn't what it does
<oriansj>hence why the create date would map best to when the file was checked out (not the modify date, which should map to when *you* modified it last)
<oriansj>but I also grant you some build systems might just look at the modify date and nothing else and fail to build if only the create date was changed on checkout.
<muurkha>well, it's true that if you did it that way, and your build system took max(st.st_mtime, st.st_ctime), then it would work fine
<muurkha>I don't know what GNU make looks at
<oriansj>well looking at make v3.82 (from live-bootstrap) in the file filedef.h the definition of struct file only has MTIME but no CTIME or ATIME
<oriansj>"struct file"
<oriansj>so logically, they are only looking at MTIME and nothing else
<oriansj>but I do know tar for example will set the mtime to match the mtime of the file in the tarball when extracting
<oriansj>so your example using version control would also occur if 2 people were sending each other tarballs in separate directories as make isn't looking at the ctime (which would be later than the mtime)
<oriansj>but that is also why my makefiles include all of the source code needed in the build (including the .h) so that the modification of the .h would have forced the rebuild of the .o for the .c file
<muurkha>yeah, tar does do that (because it's designed for making backups)
<muurkha>and it is indeed a potential problem if you untar a tarball on top of files you've been working on
<muurkha>but usually you try not to do that for other reasons
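tar's mtime restoration can be checked with the stdlib tarfile module (a small sketch; file names and the timestamp are arbitrary):

```python
import os, tarfile, tempfile

d = tempfile.mkdtemp()
src = os.path.join(d, "foo.c")
open(src, "w").write("int x;")
stamp = 1000000000                           # mtime recorded into the tarball
os.utime(src, (stamp, stamp))

tball = os.path.join(d, "src.tar")
with tarfile.open(tball, "w") as t:
    t.add(src, arcname="foo.c")

out = os.path.join(d, "out")
with tarfile.open(tball) as t:
    t.extractall(out)                        # restores the archived mtime

print(int(os.stat(os.path.join(out, "foo.c")).st_mtime) == stamp)  # True
```

The extracted file carries the archived mtime, not the extraction time, which is why untarring over an mtime-driven build tree can mask changes.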
<muurkha>do you know about gcc -D btw?
<muurkha>uh
<muurkha>I mean gcc -MM
<muurkha>or gcc -M in general, though sometimes that's unwieldy
<oriansj>I knew about gcc -D (and implemented -D in M2-Planet as well) but gcc -M is new and interesting to me
<oriansj>as is gcc -MM
<muurkha>I misremembered -M as -D for some reason, but obviously -D is #define
<muurkha>there's a section in the GNU Make manual about how to use -MM to automatically generate dependency files that will automatically be included into your makefile
<oriansj>I can see the logic and usefulness of the combo
<oriansj>and if I wanted to speed up my builds, I probably could leverage that going forward.
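The -MM pattern referred to above looks roughly like this (a sketch based on the GNU Make manual's "Automatic Prerequisites" chapter; SRCS and prog are placeholder names):

```make
SRCS := foo.c bar.c
OBJS := $(SRCS:.c=.o)

prog: $(OBJS)
	$(CC) -o $@ $(OBJS)

# gcc -MM emits "foo.o: foo.c foo.h ..." without system headers;
# -MT names both the object and the .d file as targets, so the
# dependency file itself is regenerated when any listed header changes.
%.d: %.c
	$(CC) -MM -MT '$*.o $@' $< > $@

include $(SRCS:.c=.d)
```

The include pulls the generated rules into the makefile, so editing a .h forces the rebuild of exactly the .o files that use it, with no hand-maintained header lists.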
<oriansj>I wonder if there is a similar trick for Java
<muurkha>I think the Java compiler is supposed to compile only the things that need to be recompiled
<muurkha>but for whatever reason people have written a bunch of build systems for it
<oriansj>muurkha: well I know people who do insane things with Java builds. Like recursively depend on previous binary releases to be able to build the new release
<muurkha>that's often convenient for development
<oriansj>muurkha: nothing about 14 hour long builds sounds convenient to me (but then again I was the one trying to convince them to convert to 2 minute non-recursive builds)
<muurkha>were they frequently waiting 14 hours for their builds?
<oriansj>yep
<oriansj>and still do til this day
<muurkha>huh, that does sound inconvenient
<muurkha>but using an existing compiled artifact to compile the current version is often pretty convenient
<oriansj>muurkha: well they are also contractors paid by the hour so consider the possible reasons why they wanted to retain control of the build process and not change anything
<muurkha>heh
<oriansj>greetings one
<one>hahaha, yello!
<oriansj>and yes you can ask your questions here, you will likely find live-bootstrap https://github.com/fosslinux/live-bootstrap a place where you can learn a great deal about what makes a modern distro work and why
<one>I found guix bc I'm a liberty freak, and found my way into pc's via crypto currency(history of economics/monetary background). GNU was the best learning without a doubt. But the way I learn, i decided I needed to try shit to get a real lay of the land, like i would investing...
<oriansj>stage0-posix https://github.com/oriansj/stage0-posix literally starts with the assumption of only a kernel and builds up linkers, assemblers, compilers, a shell and is the starting place for live-bootstrap so you can quickly learn about all the major pieces of a modern system and how they fit together.
<one>"what are my actual tradeoffs question, led me from Linux to GNU to BSD to Haiku-OS etc...
<oriansj>oh and yes it is the root of trust for all cryptocurrencies that don't want to be vulnerable to the Trusting Trust attack
<one>"Simpler" systems in old ones, helped demystify why things are as they are for me personally
<one>But, i found i love computing, even fascist like unopen powershell is fun...
<one>But, i really started thinking about guix on BSD, and haiku... windows would be cool... and it looks like some folks are on that already I guess..
<oriansj>actually powershell is under the MIT license these days: https://github.com/powershell/powershell and why distros like Ubuntu have it packaged.
<one>But was curious with haiku, which is really a single user pc by design, but does have a multiuser work around on github... it looked like there needed to be a getent
<one>yeah,you are right... LOL. Still kinda feels like an ex police officer opening up a pot shop
<one>But, I did rationalize it's use on the open source basis. haha
<one>chocolatey is a lot of fun too
<one>But, haiku-os is also very unique, kinda curious on thoughts about it being a single user/root user platform etc...
<one>uses pkgman
<oriansj>well Microsoft is like a shark; if replacing the NT kernel with Linux would make more profit, they would do it.
<one>for sure
<one>I don't hate profit motive as much as I hate the use of really dumb incentives
<oriansj>Well Haiku like the BeOS it was based on was a good idea but it came too late and without a willingness to bring a baseball bat to the fight.
<one>It's hard to win out against the Bank of international settlement. They literally got shelved bc asia 96 led into dot com plus a hangover. It's amazingly complete, but given its a ground up build, makes sense.
<oriansj>the Be Filesystem is probably the most innovative bit of the user experience. Some people in the Data curation space even have tools to replicate that functionality on Windows and Linux these days (if a good bit slower)
<one>I just really like it from a dev standpoint. I don't know what percentage, but it has like 13k packages, all of which(damn near all) are useful, especially to mom and pop stature individuals and business. Feels like people need more choice. And for the first time, it might be possible to skirt money printing like GNU has with only 3% of funds being corporate. Also, very much like linux in being less interesting to bad hats
<oriansj>one: well any problem space where people can engage in non-zero sum exchanges; money isn't required to encourage cooperative behavior. As everyone would receive a multiple of the effort they invested on average.
<one>I really wish I was bright enough to bring the BE kernel or a reverse engineered prototype. SO much that could be fulfilled there. haiku bootloader is also interesting....
<oriansj>one: well if you want a strong education on how a kernel can work builder-hex0 might be of interest: https://github.com/ironmeld/builder-hex0
<oriansj>a POSIX kernel in less than 2K of assembly
<one>oriansj: Because I'm insane, my goal is actually to become a super user before I'm dead. So I'll definitely be checking this all out. Started code and realized I could be a monkey pushing buttons building for systems i didn't understand, or I could take the long route to be closer to a computer scientist than a well paid cog
<one>And, i don't mean to marginalize bootcamps, just that a lot of new talent doesn't necessarily get the whole picture
<one>I know, bc a business person, I was always translating between tech and biz nerds and learning on the fly... its hard to get everyone rowing in the same direction, but i'm hoping that's what i'll be able to do...
<oriansj>one: to fully understand how computers and software work will not take someone who cares to learn very long.
<oriansj>but that doesn't mean you will know all software as it tends to be a reflection of the values of the people writing it.
<one>right. Just figuring out what within the space that's one's priority, let alone the tools that get you there as envisioned. I mean, I thought I was a crypto guy, until it dawned on me i could work on decentralized and freedom enhancing things in all kinds of spheres... Crypto is cool, but it's also SO narrow.
<oriansj>one: well starting from the basics usually is a good idea. Then you can bootstrap your understanding upon universal truths that don't shift like sand under your feet.
<one>That was my thought.... LOL. Why I gravitated to GUIX... I guess I could have gone full mental in the interest of short term pay and decided to learn COBOL hahahahaha
<one>JK
<oriansj>one: learning COBOL is fine if that actually interests you
<oriansj>but honestly what is your goal, or hope to achieve once you become that super user.
<one>Does a $90k salary count as an interest? I mostly kid, and jokes are half truths. I said it bc being a money nerd, I did think about it. :D
<oriansj>one: yes it is but if your goal is making money $90K/year would be on the low end if you are willing to do what is required to learn skills people need to solve their problems.
<one>I mean, I think we are in an era of change and fascism. I hope that I cn get up to speed to contribute to projects I like, to build my own ventures, and to help re-frame what choice(or tradeoffs) looks like. If I hit the lottery, it would be as a programmer for an open or free source project. One that thinks big but local
<oriansj>one: so you wish for a meaningful life of purpose and connection to others where you will feel welcomed and accepted for who you are.
<one>Nobody has as much knowledge about global trade, emerging markets, the history that forged those environments, or about monetary policy both historical & current. I intend to capitalize on all of that, by learning the vocab and tools/processes to begin actually building solutions. That work for everyone, and not just work for those that can write policy or contribute donations. I guess I want to take "intranets" style community to global scope.
<oriansj>one: what about that is the thing you most want to share and create?
<one>a few things seem paramount. Global thinking isn't accompanied by planning that treats users locally. I tend to think about whole cities(think city states) utilizing decentralized tech to fulfill tasks that could only reach price equilibrium through mass use previously. The fact that Google built a server center in Henderson NV for example. In LV, we have huge industry and plenty of locals that could fulfill their needs by themselves. We
<one>could easily have 1 million or more with servelets, desktop servers etc... that could build all their local services around those nodes...
<oriansj>and I assume you considered the EROEI problem right?
<ilmu[m]>err one I am sorry to toot my own horn but please check out [datalisp.is](https://matrix.to/#/#datalisp:matrix.org)
<muurkha>ilmu[m]: what's datalisp.is?
<ilmu[m]>it's what I believe is a sensible way forward
<muurkha>what do you believe is a sensible way forward?
<ilmu[m]>there's a bunch of things that care about authenticated datastructures ... reproducible builds, version control, cryptocurrencies, ... and a good way to build authenticated datastructures is to have a canonical encoding
<ilmu[m]>so basically the first step would be to find a good general purpose encoding for building these systems
<ilmu[m]>thankfully that problem is long since solved: canonical S-expressions
<ilmu[m]>so okay, done, we have an encoding for the data interchange happening on the internet. Now what we need is a way to manage the contexts for interpreting said data. That is; a name system that lets you refer to a context in a way that can receive updates.
<ilmu[m]>Okay but updates you say, that is not an easy problem, how can we do them in a decentralized manner?
<ilmu[m]>bam we've reached the infinite dollar question that spawned democracy and all that jazz
<one>I guess the simplest way to frame change that should occur according to actual dynamics and choice, just relates to thinking based on flagging business practices that aren't requisite. Not to mention printed money and cost to maintain large infrastructure.
<one>I don't know, ican go buy a soda at a soda machine...
<ilmu[m]>I would amend your phrasing; you can access soda from a soda machine
<ilmu[m]>it's actually access control this money thing
<ilmu[m]>muurkha: thanks for the prompt :P
<one>hash sigs, qr codes etc, solar photovoltaic cells, use of blockchains
<one>all centralized money is access control.
<one>but amazingly, even that could be simplified. You could automate barter just as a really lamebrained example of a different approach
<oriansj>well a social set of rules which effectively provide access control and punishment for those who violate those controls
<one>I'm not arguing in favor, just pointing out that the technological if well conceived can act as the bridge between the actual process, and that felt by a user
<one>China... and the USA, and anyone else with legislation meant to nudge..
<one>Scope is the crux
<muurkha>ilmu[m]: hmm, it's an alternative to JSON coupled with a namespacing system for data dictionaries?
<ilmu[m]>meh
<one>A monopoly makes sense if scope allows(trains competing over single set of rails isn't cheaper)
<one>So does decentralization.
<ilmu[m]>I think a lot of us agree on the platitudes, it's the concrete implementation that we struggle with.
<ilmu[m]>usually the easiest thing to agree on is the simplest thing
<one>for sure, but that's also a scope thing
<one>or the local thing where all participants are only subject to those they can reach
<oriansj>scaling laws exist and bring hard questions about when is it better to cap sizes of industries
<one>If a solution mus be everywhere at once, its probably not a solution
<ilmu[m]>json has the same problem as C strings; escaping. Prefix length strings are not deficient in the same way. Therefore we will eventually use them. S-expressions are well established and anything else would be equivalent anyway so let's just keep it simple.
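The length-prefix idea behind canonical S-expressions can be sketched in a few lines of Python (a toy encoder illustrating the concept, not datalisp's actual format):

```python
def encode(item):
    """Canonical S-expression sketch: atoms become length-prefixed
    byte strings ("3:foo"), lists are wrapped in parentheses.
    No escaping is ever needed: the length says where the atom ends."""
    if isinstance(item, bytes):
        return str(len(item)).encode() + b":" + item
    return b"(" + b"".join(encode(x) for x in item) + b")"

# An atom full of delimiter characters needs no quoting or escaping:
print(encode([b"sha256", b"abc:()"]).decode())   # (6:sha2566:abc:())
```

Because every atom is delimited by its declared length rather than by special characters, the same byte string has exactly one encoding, which is what makes hashing it meaningful.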
<one>scaling and scope aren't exactly the same
<ilmu[m]>building a distributed system requires you to think about how you will do communication between the parts.
<ilmu[m]>therefore fixing some standard encoding is important.
<ilmu[m]>and yeah the namespacing and data dictionaries are basically on point
<ilmu[m]>but really the key is that we need to finish packaging software if we want to have any hope of regular people using it.
<ilmu[m]>that means we aren't done just by building it, we also need to index the ways we can invoke it
<ilmu[m]>that way regular people can access it
<oriansj>ilmu[m]: if one assumes a single program/protocol rather than a series of competing implementations upon the same good idea (Nix package definitions vs Guix package definitions as an example)
<ilmu[m]>then we can start doing some probability theories and economics around updating the definitions.
<ilmu[m]>oriansj: I don't see why we'd actively want to seek out the tragedy of the commons..
<muurkha>I think JSON is better established at this point than S-expressions, and it has better chances of avoiding clashes between independent format extensions
<ilmu[m]>json is deficient.
<muurkha>sure, everything is
<ilmu[m]>no.
<ilmu[m]>not everything.
<oriansj>novelty first search the space will probably result in better outcomes but I can't be certain.
<one>translations and efficacy can make an output standard even if the inputs aren't. Depending upon the need, which language? For what industry? With what geographical requisites?
<ilmu[m]>the math says: prefix strings are adequate. we can't do better.
<muurkha>math doesn't say "better" or "worse"; that's a question of values
<ilmu[m]>oriansj: yeah exactly, so we're done searching, now we need to rework the foundations and use the stable solutions that will also be just as good in a thousand years.
<ilmu[m]>pythagoras theorem is still true today
<muurkha>the tragedy of the commons is a conflict over scarce resources; free software doesn't need any significant resources
<oriansj>ilmu[m]: only under a set of assumptions that are not always true.
<ilmu[m]>muurkha: no it's mismatched incentives in the local vs global scopes
<ilmu[m]>oriansj: sure but the set of assumptions we have with computers as they are today is that we need to serialize data to send it somewhere
<muurkha>you mean, individual interests that conflict with collective interests?
<ilmu[m]>yes
<oriansj>ilmu[m]: or just send the data in its current form, no need to serialize at all
<ilmu[m]>could you expand on that a bit
<muurkha>I think the tragedy of the commons is generally understood to be something much more specific: an argument to privatize scarce resources to prevent them from being oversubscribed
<oriansj>ilmu[m]: if I send you an sqlite database file, I don't serialize anything.
<ilmu[m]>muurkha: sure that is a common way to explain it
<one>we need to serialize data based on less geographically independent data(global massive data centers) w/ more grouped data. The point of most of what I was saying, is about approaches being probably the biggest part, more so than available code that works
<ilmu[m]>oriansj: yes you do, the database is a file, that "file" is a stream... serialized
<one>I'm talking the extent of serialization
<one>not whether it's required sum total
<oriansj>ilmu[m]: you can download that file from not a single stream but hundreds or thousands in parallel based on only the length and page sizes
<muurkha>it's sort of true that oversubscribing public scarce resources is a conflict between individual and group interests, but actually the individuals in Hardin's parable aren't getting their needs met either. but it's definitely much more specific than tradeoffs between the needs of the many and the needs of the few
<ilmu[m]>oriansj: okay but we are gradually missing the point here
<muurkha>WP: "In economic science, the tragedy of the commons is a situation in which individual users, who have open access to a resource unhampered by shared social structures or formal rules that govern access and use,[1][2] act independently according to their own self-interest and, contrary to the common good of all users, cause depletion of the resource through their uncoordinated action."
<ilmu[m]>ilmu[m]: the point I want to make is that once you receive this data (which of course is serialized, no matter if you share it via torrent or whatever, you will reconstruct it to be some file or some bits on your hard drive) then you need to understand that it is a sqlite file, that is what a name system is for.
<one>GEOGRAPHY. What economic is exactly the same based on geography alone? Even businesses in the same region with the same model could skin the cat differently
<muurkha>usually people use the term "serialization" to mean the transformation from a useful data structure to a flat series of bytes. in oriansj's sqlite story the serialization happened at an earlier stage of the story: when he inserted or updated rows in the database, not when he sends you the file
<ilmu[m]>one: I feel what you are pointing out is the need for a namesystem to be decentralized (i.e. I don't need permission from some central authority to decide how I skin the cat)
<oriansj>ilmu[m]: well that is the type inference problem and name systems can't solve that because of the bad actor problem and the bob likes the .xlxx file extension problem
<muurkha>WP: "In computing, serialization (US and Oxford spelling) or serialisation (UK spelling) is the process of translating a data structure or object state into a format that can be stored (for example, in a file or memory data buffer) or transmitted (for example, over a computer network) and reconstructed later (possibly in a different computer environment)."
<one>But useful data comes from all kinds of single entries that get scooped up and formulated into many things.
<ilmu[m]>> <@one:libera.chat> GEOGRAPHY. What economic is exactly the same based on geography alone? Even businesses in the same region with the same model could skin the cat differently
<ilmu[m]> * I feel what you are pointing out is the need for a namesystem to be decentralized (i.e. I don't need permission from some central authority to decide how I **name my method for** skinning the cat)
<ilmu[m]>oriansj: sorry but why such a strong result? **can't** really?
<ilmu[m]>so the bad actor problem is being reasoned about with these byzantine fault tolerant things in the crypto world but it boils down to tragedy of the commons (i.e. you'd want everyone to have local incentives that are compatible with the global incentives ... preservation of planet, survival, etc.)
<one>I think it's as simple as in C How to Program (fourth Edition). Why reinvent the wheel? Open source is innately that..
<ilmu[m]>Hmm, well anyway, I will be around. I need to go sleep now since I need to wake up in a few hours. I've written some incoherent things on how we can chip away at some of these problems that we just started discussing here, and I'd be happy to discuss these things more.
<ilmu[m]>...but it'll be this weekend :)
<one>Well yeah, just look at how tribal crypto(sadly distros too) is. Nobody wants to row together and wants to be the next bitcoin, which is shit mind you. Why not just start tracking all your activity and just mailing it directly to Utah?
<oriansj>ilmu[m]: the reason why is if you imagine a society which assumes all people are greedy and sets up perfect rules. The system will break when a few people are generous
<one>if locals work together using tech tools to bridge information gaps(what made industrial revolution and city life real, was be close to info AKA the city)
<one>There is actually no need for antiquated practices built around tradeoffs we no longer have to make.
<one>Frederic Bastiat could live his same life now, and be trendy. You can actually live on a farm and be up to date on damn near everything
<oriansj>one: assuming the chesterton fence no longer serves a purpose...
<one>I used it to learn computers...
<oriansj>well that is the greatest strength of the internet and the source of its greatest danger. It will show you exactly what you want to see.
<one>I'm definitely much more informed than I was before that. And I have already been able to use a handful of give away pc's to literally roll through all GNU, LINUX, BSD, Haiku etc... And I was raised a hockey playing jock that no educator ever thought to show a computer...
<one>I mean, I literally have thousands of dollars in functioning pc if not in price
<oriansj>well the price of one's computer doesn't make it good or bad. But the software and the user's mastery of it will make it impressive or sad.
<one>Is being able to run windows, haiku os, solaris, and linux/gnu on a system a sign of knowing anything, or just dumb luck?
<one>Kinda kidding, but I literally have been trying to fix and blowup my pc's so i can fix them again...
<oriansj>one: it is a sign that you were curious enough to play around and get comfortable.
<one>Or, i just didn't get dissuaded with every failure
<one>LOL
<oriansj>if something is fun for someone, it doesn't matter how hard it is. They will probably find a way to get really good at it
<one>That's what i kept telling myself...
<oriansj>well, give yourself time to absorb and ask and share and you might find what you are looking for and maybe even find something even better along the way. ^_^
<one>I think you hit the nail on the head. For example, I only found guix after getting cornered on a GhostBSD build(which is almost always like plug and go for BSD). So I was annoyed, but it led me into guix/guile and lisp. Pretty good screw up long run.
<one>FEELS like the right approach. But IS SLOWWW
<one>I gotta get focused and do linuxfoundation stuff so, I have to get out of here before I get too sidetracked. But thanks for all the talk from everyone. Hope you all are doing great and feeling great! Peace!
<one>I meant to really say this, but it had escaped me.... when a lot of "old" tech was sold as outdated, it was against the wrong scope and approach. My major thought, is what percentage of what was abandoned was done so erroneously bc of bad thinking
<one>even atarish MiNT servers using eifel have amazing promise if re-envisioned for the right project.
<muurkha>atari MiNT servers using Eiffel?
<ilmu[m]><oriansj> "ilmu: the reason why is if you..." <- I don't know how you managed to prove that but even if it was unconditionally true then it would still be a pretty odd thing to say about what I am doing. My whole premise is that each individual should be allowed to choose who and what to trust (basically FOSS: run whatever code you want) all I want is to make tools that simplify automatic updates and accessibility of software for the
<ilmu[m]>"normal people" - if it is so bad to base things on what we know about math and economics then I am afraid the powers that be have eroded trust to an absurd extent.
<oriansj>ilmu[m]: well as long as you allow them to override and choose badly if they wanted, then yes your idea can be quite useful.
<fossy>stikonas: i'm pretty happy with autogen pr now. merged - cant find any further repro issues
<stikonas>fossy: ok good, the only slightly messy thing is dynamic linking now
<stikonas>but I'll move musl .so build after autogen for now
<stikonas>and later after newer GCC
<stikonas>but maybe after you merge your changes...
<stikonas>otherwise there will be too many conflicts (and there are also 2 work in progress PRs by doras)
<fossy>yea
<fossy>wait, can we do += in M2-Planet now?
<fossy>perhaps i just misremembered...
<stikonas[m]>fossy: yes all operator= compont operators are now implemented
<stikonas[m]>Maybe not in the most efficient way but it works
<stikonas[m]>(I think they have their own copy of assembly output rather than reusing it from +, - etc
<stikonas[m]>s/compont/compound/
<fossy>i see, makes sense
<stikonas>fossy: though obviously we can't use them in M2-Planet itself
<stikonas>as it has to be buildable by cc_*
<one>muurkha, there are neat mini server setups that use eifel and busybee pc's.... they are very low key and small projects.. but, everything is like that until it catches on. I think the group that was still active was in southern france..
<one>Looks like a lot of it has transitioned into EmuTOS
<one>really, modular, intranet is being reborn via integration
<one>Without corporate oversight and top down management. that's my take.
<one>It's important to note Strauss-Howe archetypes, credit cycles, and "human century": about 10 yrs longer than the lifespan. The constant erasure of collective memories, and the epigenetic trauma of present regardless.
<one>-"of" +"that's"
<muurkha>one: I promised not to debate the nature of truth anymore in here
<oriansj>one: it tends to lead to non-productive discussions
<oriansj>but technical discussions and learning about how things work and why certainly is
<stikonas>oriansj: speaking of technical discussions, I'm now working on position independent cc_amd64
<stikonas>I probably won't upstream it back to stage0-posix but it's still necessary as intermediate step in porting to UEFI
<stikonas>it's partially working but segfaults on more complicated stuff
<stikonas>(I think test005 now from M2-Planet test suite)
<oriansj>oh, it isn't expected to work on any M2-Planet tests
<oriansj>just M2-Planet using the bootstrap C code
<stikonas>yes, but at least with stage0-posix version I get useful error message
<stikonas>with cc_amd64 it's segfaulting
<stikonas>(though some simple stuff compiles)
<stikonas>I might have missed some absolute addressing
<oriansj>did you get the debug function operational?
<stikonas>or maybe introduced a bug
<stikonas>debug function should work
<stikonas>it's probably the type stuff that is partially broken
<oriansj>no, it is the first function you need to test in cc_*
<stikonas>cause that was the trickiest test
<oriansj>as it will enable you to figure out what is going wrong
<stikonas>ok, I'll enable it
<oriansj>then you can do tiny tests and know what is causing problems
<stikonas>might be more than one thing
<stikonas>I already found one missed "mov register, label" instruction with gdb (after which more stuff started working)