IRC channel logs

2013-06-28.log


***fangism-shadow is now known as clangism
***clangism is now known as fangism-shadow
***fangism-shadow is now known as fangism-hungry
<davexunit>I just wrote a sprite batch renderer using figl. :)
<davexunit>it would perform better if guile did floating point math in a more efficient manner.
<civodul>Hello Guilers!
<janneke>Hi civodul
<wingo>moin
<janneke>morning!
*wingo simplifying guile's intermediate language in preparation for writing a compiler to rtl
<wingo>morning ceyusa :)
<ceyusa>hello wingo
<ceyusa>I saw the guile-oauth tweet
<ceyusa>nice
<wingo>yeah neat stuff!
<wingo>its author lives in california, so he's probably asleep :)
<ceyusa>hehehe
<civodul>wingo: yeah for the compiler!
<wingo>civodul: yeah!
<wingo>i have an ANF pass
<wingo>and it makes me think we should do CPS
<wingo>because as Kennedy noted in the great Compiling with Continuations, Continued paper
<wingo>if expressions introduce join points, which are simply continuations (!)
<wingo>so (let ((x (if a b c))) (* 2 x))
<wingo>translates to (let ((j (lambda (x) (* 2 x)))) (if a (j b) (j c)))
<wingo>so it seems that while we're doing that, we might as well do full CPS and get the benefit of named control points
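
A slightly fuller sketch of the join-point transformation wingo describes, as plain Scheme one could paste into a Guile REPL (the procedure names are just placeholders wrapped around his own example):

    ;; Direct style: the value of the `if' flows into the `let' binding.
    (define (direct a b c)
      (let ((x (if a b c)))
        (* 2 x)))

    ;; Join-point style: the code after the `if' becomes a local
    ;; procedure `j' -- a continuation -- and each branch tail-calls it.
    (define (with-join-point a b c)
      (let ((j (lambda (x) (* 2 x))))
        (if a (j b) (j c))))

    ;; Both behave identically:
    ;; (direct #t 3 4)          => 6
    ;; (with-join-point #t 3 4) => 6
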
<civodul>hmm ok
<wingo>anyway i'll poke at both
<civodul>i guess i need to read the paper first ;-)
<wingo>it's a great paper :)
<wingo>you should steal work time to read it :)
<civodul>heh :-)
<wingo>or at the very least steal work's printer :)
<civodul>yeah, that's easier than stealing work time these days ;-)
<add^_>wingo: where can I find this "Compiling with Continuations, Continued paper"?
<DerGuteMoritz>Compiling with Continuations: The Revenge
***fangism-hungry is now known as fangism-shadow
<fbs>as a book on amazon i think
<add^_>And also, when I think of continuations, I remember you talking about it being "bad" compared to delimited continuations, maybe I've misunderstood that.
<add^_>A book? I thought he said it was a paper..
<add^_>Hm
<fbs>hmm
<fbs> http://research.microsoft.com/~akenn/sml/CompilingWithContinuationsContinued.pdf
<fbs>maybe that
<add^_>Dunno
<add^_>I saw that but I'm unsure.
<add^_>Maybe it is that..
<fbs>well that's the only paper with that name
<add^_>Yeah
<fbs>the book is 'compiling with cont'
*ijp fondles his copy
<add^_>Actually, it has the same author last name, so it's probably that paper
<add^_>:-)
<fbs>sup ijp , add^_
<ijp>nothing much, waiting for a backup to finish
<fbs>:)
<add^_>Not much here either
<add^_>You?
<fbs>looking for an fpga project i can do
<fbs>:>
<DerGuteMoritz>a lisp machine!
<add^_>fbs: https://en.wikipedia.org/wiki/Field-programmable_gate_array ?
<fbs>ye
<fbs>looking at some cheap 2" color lcds
<fbs>i got one, but with a 16-bit interface
<fbs>don't have enough wires to hook it up :x
<add^_>:-/
<fbs>driving lcds is also easier with a processor
<taylanub>I wonder, would it be feasible to make the optimizer look into records, vectors, etc. ? Especially if we had immutable versions of those I guess. This makes me realize there is some kind of abstract relation between immutability and syntax ...
<taylanub>Or maybe I just think that because I was planning to abuse syntax for a kind of optimization that would be more naturally done (and automatically by the compiler/optimizer) via constants/immutability.
<taylanub>Hrm .. if an immutable object is also initialized with a compile-time constant, it exists fully during compilation already, and is in this sense similar to syntax.
<ijp>wat?
<fbs> http://t1.gstatic.com/images?q=tbn:ANd9GcRdtGY1sym8j6HrN6aoDwIRfQlRDliyAdHi2S9KIZIR5WEi9FIbFQ
<taylanub>Haha. :D
<taylanub>ijp: The whole expression (let ((x 5)) (+ x 3)) is, in some crazy abstract sense of the word, merely "syntax" for the value 8.
<ijp>a wrong sense of the word
<taylanub>You know I was planning to make a syntactic version of my bytestructures module such that the evaluation of indices (i.e. calculation of offsets) in the ref/set operations (which are syntax) happens at compile-time. I realized that it would be much neater if I could just make the descriptor objects immutable, and put them into immutable variables, and have the optimizer do the job instead.
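
A hypothetical sketch of the expansion-time offset calculation taylanub describes; the macro name, field layout, and lookup table here are invented for illustration and are not the actual bytestructures API:

    ;; Hypothetical descriptor: an alist from field name to byte offset,
    ;; known to the macro at expansion time.
    (define-syntax struct-offset
      (lambda (stx)
        (syntax-case stx ()
          ((k field)
           (let* ((layout '((x . 0) (y . 4)))   ; imagined two-field struct
                  (off (assq-ref layout (syntax->datum #'field))))
             ;; the looked-up offset becomes a literal in the expansion
             (datum->syntax #'k off))))))

    ;; (struct-offset y) expands to the constant 4, so no offset
    ;; arithmetic is left for run time.  (Unknown fields are not
    ;; handled in this sketch.)
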
<ijp>taylanub: (eval '(let ((x 5)) (+ x 3)) (environment (import (rnrs) (rename (+ -) (- +)))))
<ijp>bah, not environment, what's it called again
<taylanub>ijp: Ah, good point. + is not an immutable variable. :)
<ijp>that wasn't the point
<taylanub>It was! You just don't know it. :P
<ijp>the point is that you are conflating syntax and semantics
<ijp>syntax has nothing to do with meaning
<ijp>hmm, no, environment was right, it's just that (rnrs eval) is not a part of (rnrs) by default
<ijp>(eval '(let ((x 5)) (+ x 3)) (environment '(rename (rnrs) (+ -) (- +)))) $3 = 2
<taylanub>Yeah, my problem is that I started to think of syntax as a way to optimize through immutability.
<ijp>taylanub: and in r6rs imports are immutable
<taylanub>Whereas optimization through immutability should not be done via syntax, but through more direct immutability facilities.
<taylanub>Syntax just so happens to be "run-time-immutable."
<taylanub>(Because it's always evaluated and eliminated at compilation.)
<taylanub>(Similar to how operations on immutable values can be evaluated at compile-time and thus eliminated in the resulting program.)
<ijp>apparently I'm not speaking english
<taylanub>Must be a British English vs. Turk-German English problem!
<taylanub>(Or maybe I'm speaking Taylanubian English.)
<fbs>im still 'wat'
<taylanub>fbs: TL;DR: You can use fancy macros to force some computation into compile-time, but you should look out for cases where the optimizer could do this just as well, without needing you to obfuscate your program's semantics. (Because macros are only one of potentially many things that can be and are evaluated at compile-time.)
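
For a concrete example of the "optimizer could do this just as well" case: Guile's partial evaluator folds pure constant expressions on its own, which can be checked with the ,optimize REPL meta-command (the transcript below is what one would expect to see, reproduced from memory rather than captured):

    scheme@(guile-user)> ,optimize (let ((x 5)) (+ x 3))
    $1 = 8

No macro is involved; the whole expression is reduced to the constant 8 before code generation.
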
<ijp>relying on a non-guaranteed optimisation is dodgy at best
<ijp>because, dundundun, they aren't guaranteed
<taylanub>True, if writing portable Scheme your only way to make sure that something happens at compile-time is to use macros. (Even then, your code could be executed by an interpreter. :P)
<ijp>taylanub: portable across different versions of the same program
<ijp>you don't need to go as far as cross platform or cross implementation
<taylanub>You mean different versions of the same Scheme implementation ? True.
<ijp>I don't think we are likely to remove the current optimisation passes, but I don't think we exactly say what we try to optimise either
<taylanub>Yeah. OTOH, if there was some good immutability support, then the relevant optimizations would be highly expected or even taken to be a given, I imagine.
<ijp>now, immutability doesn't mean as much as you think
<ijp>proof: the untyped lambda calculus is capable of universal computation
<taylanub>Well yeah, both the variable and the object it holds need to be immutable, AND the object needs to be created at compile-time.
<ijp>reread the last thing I typed
<taylanub>Oh ...
<ijp>(this is actually also true of syntax-rules)
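
A minimal illustration of ijp's parenthetical: the macro expander itself is powerful enough to loop forever, so moving work to expansion time gives no termination guarantee either. (Don't actually evaluate this; the expander will just spin.)

    ;; Each expansion of the macro is another use of the same macro,
    ;; so expansion never reaches a fixed point.
    (define-syntax loops-forever
      (syntax-rules ()
        ((_) (loops-forever))))

    ;; (loops-forever)   ; expansion of this form would never terminate
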
<taylanub>Indeed, I had that question in mind too: how do you make sure that "partial evaluation" terminates ?
<ijp>we count
<taylanub>Count ?
<ijp>too many iterations and it bails
<taylanub>I see. Same strategy would be used for immutables then.
<taylanub>Well
<taylanub>Immutables' optimization *is* partial evaluation.
<ijp>yes
<ijp>except for things like file handles
<add^_>Gosh, every darn time I hear about the lambda calculus, it just seems so darn awesome, I need to learn more about it.. ijp: is there a good book about it? Or is it just a LOT of papers?
<ijp>which are technically immutable, but represent something mutable
<taylanub>add^_: I learned it from the Wikipedia article.
*ijp glares at haskell
<add^_>Well, then I'm set too, but I want to know more! xD
<taylanub>The untyped one, at least. I'm not sure I really know the typed one; I think I sort of know it, but I didn't go through the article fully and get that "yeah OK, that's all, I got it" feeling.
<ijp>barendreght is the standard reference, but it's not exactly cheap
<add^_>barendreght... hm ok
<add^_>Will look it up
<fbs>sounds dutch
<ijp>jao: how was the seldin book on the lambda calculus?
<add^_>- the h?
<ijp>I know you were reading it, it used to pop up on the sidebar of your web site :)
<ijp>fbs: I believe he is, yes
<add^_>well $25 isn't that bad..
<jao>ijp, quite good
<ijp>strange what your mind remembers. I often lose my keys, but I can remember obscure facts about jao's website layout from a while back.
<jao>ijp, should i be afraid? ;-)
<ijp>I dunno, maybe
<taylanub>The Oatmeal had a comic on that.
<ijp>the oatmeal has a comic for everything
<ijp>I hate that
<add^_>Wow, ijp I just saw the book for $186 xD
<add^_>I have no idea why the price is so... different from the amazon one
<fbs>both new?
<ijp>probably an early edition
<add^_>The cheap one?
<add^_>Or the expensive one?
<ijp>no the expensive one
<add^_>Ah
<add^_>Right
<ijp>hmm, I think looking at amazon, I might have overpaid
<add^_>How much?
<ijp>ah no I didn't, I just forgot the same order had two other books
<add^_>ah ok
<add^_>:-)
<ijp>hmph, my corrected third edition of feller costs around £56 (according to amazon), but I think I paid ~£5 at a charity shop
<ijp>the second volume costs twice as much!!
<add^_>I might buy the lambda calculus book, is there something else I can read while I don't have it?
<ijp>add^_: caution: have you actually read a mathematics book before?
<add^_>hm
<fbs>its not fun
<ijp>fbs: well, depends, but it's slow going
<add^_>Not really, I have read some stuff, but not really an entire book
<ijp>okay great, backup failed
*ijp groans loudly
<add^_>fbs: it depends if one's interested in it or not ;-)
<add^_>I was going to read... what was it called? the story of the square root of minus i or something
<add^_>I don't remember
<fbs>imaginary numbers
<add^_>Yeah
<ijp>well, that book would have the advantage of a plot
<add^_>I never got the book though :-(
<add^_>Oh well
<add^_>Err, don't read that as "I purchased it but didn't get it"
<add^_>I never got around to getting it.
<add^_>Anyway, what were you thinking about ijp? Sorry about your backup..
<ijp>I'm just saying, it's quite a hardcore book.
<add^_>:-/
<add^_>I should probably go for something "lighter".
<add^_>If that's what you're suggesting
<ijp>that is sort of what I'm suggesting
<add^_>:-)
<ijp>but some people like to learn to dive by jumping off niagara falls into a barrel
<add^_>Well, I don't think I'd be able to learn anything if I died in the process
<ijp>well, it also suffers in that it doesn't cover types, if you are interested in those
<add^_>types? as in typed lambda calculus?
<add^_>Nah, don't really care.
<ijp>the basics of the lc are implicitly covered in any functional programming text
<add^_>Yeah, which is also a reason why I want to learn more about it
<add^_>Because when I read say... a paper about monads or cps, there's loads of lambda calculus and I have almost no idea what to make of it
<ijp>they don't usually go too deep into it
<ijp>maybe semanticists do
<add^_>Hmm
<ijp>everything I know about denotational semantics I've inferred from stuff filinski wrote
<ijp>which is a very bad way to learn, IMO
<add^_>lol