<please_help>I thought snow-fort hadn't been used for a long time, guess I was wrong.
<duud`>please_help: what are you actually working on? I mean you have a lot of questions :)
<please_help>A framework for symbolic computations aimed at machine learning
<please_help>including gpu-powered matrix operations, and data management
<duud`>please_help: I don't understand the connection between symbolic computation and machine learning. Could you say more about it? Last year I was also working a lot with gpu-based linear algebra, especially eigenvector decomposition of sparse high-rank matrices; I used cuda for it.
<please_help>when performing the cost update in a non-trivial network, you typically end up with complicated derivatives. Symbolic computation allows symbolic differentiation and "integration", which is way less work and a lot less error-prone.
<please_help>(that is true in ML as opposed to, say, computer vision or natural language processing, where most of the job is focused on feature engineering).
<please_help>Also, instead of going with straight cuda (which would also require having a cpu implementation of the ops for debugging purposes), I'm wrapping arrayfire, which does optimized+jit'd cuda, cpu and opencl tensor ops.
<please_help>(also other cool and necessary stuff like convolution, fft, and sampling)
<please_help>they've just gone open-source (under a 3-clause BSD license), which is pretty cool.
<duud`>never heard of arrayfire, but now I see why you're using scheme.
<please_help>It has nice features (the module system, the included SRFIs, the implementation-specific modules, the language stack), seems to be developing in a good direction, and has a userbase of size larger than 0
<please_help>My choice was pretty much between guile, chicken and racket; I preferred the way guile worked.
<duud`>and don't forget this channel ;)
***cluck` is now known as cluck
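To make the symbolic-differentiation-by-source-transformation idea concrete, here is a minimal sketch in Guile using (ice-9 match); the differentiate procedure and the handful of rules it covers are made up for illustration and are not please_help's actual framework.

;; A toy symbolic differentiator over s-expressions, exploiting the
;; fact that Lisp code is data.  Handles constants, variables, + and *.
(use-modules (ice-9 match))

(define (differentiate expr var)
  (match expr
    ((? number?) 0)                          ; d/dx c = 0
    ((? symbol? s) (if (eq? s var) 1 0))     ; d/dx x = 1, d/dx y = 0
    (('+ a b) `(+ ,(differentiate a var)
                  ,(differentiate b var)))
    (('* a b) `(+ (* ,(differentiate a var) ,b)     ; product rule
                  (* ,a ,(differentiate b var))))))

;; (differentiate '(* x y) 'x) => (+ (* 1 y) (* x 0))
;; which is why a simplification pass is still needed to get just y.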
***michel_mno_afk is now known as michel_mno
<lloda>please_help: are you also using automatic differentiation?
<wleslie>sneek: later tell wingo I remembered why my X1 carbon has a working caps->ctrl mapping, and it's super ugly: (run-shell-command "xmodmap -e 'keycode 66 = Control_L' -e 'clear Lock' -e 'add control = Control_L'")
<please_help>lloda: I'm effectively using the same method as theano, and I recall the devs discussing the difference between SD and AD, and pointing out that theano does SD, not AD, so it seems I'm not doing AD either. Truthfully, from the description on wikipedia, I couldn't tell the difference between the two, though.
<please_help>Is it simply symbolic differentiation, except using immediate values instead of symbolics?
<please_help>If so, I can already do that by forcing evaluation of the symbolic expressions at every stage if all the required variables are already there.
<please_help>what I'm doing is essentially source-code transformation (thanks to the lack of distinction between data and code in lisp languages), so in that sense it is automatic differentiation.
<lloda>autodiff can handle any program construct such as branches, assignment, etc.; one doesn't need an expression for the thing to be differentiated. To me that's the biggest advantage.
<lloda>it also requires less machinery behind the scenes and should be more efficient. I've only implemented forward-mode autodiff myself; reverse mode is more complicated.
<lloda>symbolic derivatives need simplification of terms, and that doesn't seem easy; autodiff avoids that.
<please_help>I have support for symbolic-let and symbolic-if, and I can install a passthrough (every op that doesn't match could be itself, for instance), but that would of course not be accurate. For example, d/dx (if x > 3 x -x) is the same thing as (if x > 3 1 -1) but NOT the same thing as (if 1 > 3 1 -1) nor (if x > 3 x -x)
<please_help>by that token, I don't understand how it can be possible to describe an AD that does this kind of thing.
<please_help>I already have a system to hook up simplifications (easy), but you're right that adding simplifications isn't necessarily trivial in all cases (though the common ones are)
<please_help>however, I also don't see how AD doesn't need simplification in the exact same way.
<please_help>(particularly for matrix operations: you'd end up generating 91 GB worth of matrices when after simplification you could end up with less than 500 MB; and numerical stability issues should still be relevant)
<lloda>I suppose there're many explanations, but I learned of it from www.pvv.ntnu.no/~berland/resources/autodiff-triallecture.pdf
<please_help>(I was going by wikipedia's description). I have to go now, I'll check this out and report back.
<lloda>people have derived whole FEM programs
<lloda>you're welcome! forward mode is really simple to implement, almost trivial
<please_help>lloda: I was right about AD needing simplification (lest you generate xy -> 1y + x0 when you just want y), and about if and assignment having to be handled ad hoc. AD will also be slower because you're calculating both the "forward" and "backward" parts when you only want the backward part most of the time, and I'm told (though this is hearsay) that it also gets very bogged down in several simulation scenarios (mcmc methods I
<please_help>so the advantage of AD is that you don't need to engineer the derivative, though you do have to redefine every function you're using to work on the dual number + regular number and take into account d^2 = 0.
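A minimal sketch of the forward-mode, dual-number approach being discussed, assuming a simple srfi-9 record type; the names (make-dual, d+, d*) are invented for illustration. d^2 = 0 is exactly what makes the extra term drop out of the product rule.

;; Dual numbers: x + d*x', with d^2 = 0.  Redefine the basic ops so the
;; derivative part is propagated alongside the value part.
(use-modules (srfi srfi-9))

(define-record-type <dual>
  (make-dual val der)
  dual?
  (val dual-val)
  (der dual-der))

(define (lift x) (if (dual? x) x (make-dual x 0)))  ; constants have derivative 0

(define (d+ a b)
  (let ((a (lift a)) (b (lift b)))
    (make-dual (+ (dual-val a) (dual-val b))
               (+ (dual-der a) (dual-der b)))))

(define (d* a b)
  (let ((a (lift a)) (b (lift b)))
    (make-dual (* (dual-val a) (dual-val b))
               (+ (* (dual-val a) (dual-der b))      ; product rule; the
                  (* (dual-der a) (dual-val b))))))  ; d^2 term vanishes

;; Differentiate f at x by seeding the derivative part with 1:
;;   (dual-der (f (make-dual x 1)))
;; e.g. for f(x) = x*x + 3:
;;   (dual-der (d+ (d* (make-dual 2 1) (make-dual 2 1)) 3))  ; => 4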
<please_help>on the other hand, symbolic will ultimately be faster in most scenarios. I don't think there are any other differences (more like people on both sides who don't understand the other method).
<please_help>actually I'm wrong, you do have to engineer some derivatives for AD, like e^x, log, cos/sin, etc.
<davexunit>gnusosa: any progress on the libgit2 hacking?
<davexunit>I've looked at the docs more, and the html contains everything needed to automatically generate the low-level bindings.
<davexunit>I'm feeling motivated enough to take a stab at generating them, but only if I'm not stepping on your toes. :)
***michel_mno is now known as michel_mno_afk
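Following up on the primitive derivatives please_help mentions (e^x, log, cos/sin): in the dual-number sketch above they become the base cases of the chain rule. The chain helper below is hypothetical and builds on the invented <dual> type from that sketch.

;; Each primitive carries its own derivative rule; composition then
;; falls out of the chain rule: f(a + d*a') = f(a) + d*a'*f'(a).
(define (chain f df)
  (lambda (x)
    (let ((x (lift x)))
      (make-dual (f (dual-val x))
                 (* (dual-der x) (df (dual-val x)))))))

(define dexp (chain exp exp))                       ; (e^x)'   = e^x
(define dlog (chain log (lambda (v) (/ 1 v))))      ; (log x)' = 1/x
(define dsin (chain sin cos))                       ; (sin x)' = cos x
(define dcos (chain cos (lambda (v) (- (sin v)))))  ; (cos x)' = -sin x

;; (dual-der (dsin (make-dual 0 1)))  ; => (cos 0), i.e. 1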
<gnusosa>davexunit: go ahead. I say things but usually don't have the time.
<gnusosa>davexunit: just share your work, so I can jump in and contribute
<gnusosa>davexunit: I'm interested in this part: "automatically generate the low-level
<lloda>please_help: forward works best when you have f: R -> R^n, backwards when you have f: R^n -> R. For f: R^m -> R^n people do combinations and it can get involved. You can use the forward or the backward methods in isolation, so I don't understand the part about calculating both and keeping only the backward.
<lloda>you do need to redefine every basic function, yes.
<lloda>I don't understand that part about engineering e^x, etc., but I've only used the forward method. There you define your f(dual(x)) and that's it.
<lloda>when you can get a compact simplified symbolic expression for the derivative, clearly that's the best scenario.
<please_help>forward and backward were poor choices of words in that case; in the context of neural networks they refer to calculating the layer-wise activations and the layer-wise backprop contributions, respectively.
<lloda>well, one cannot do a survey of all other fields before deciding terms. bw and fw are pretty generic terms.
<please_help>that's why I said they were poor choices of words in that case.
<lloda>maybe the neural network guys chose poorly then :p
<please_help>I think you're not understanding the intended meaning of what I'm saying.
<please_help>that case refers to my first reply after reading the slides.
<lloda>if you write the expression tree as a series of assignments, the result will be at the bottom.
<lloda>also, assignment & branching are transparent up to the point where you extract the function part and the derivative part. I've had that experience: switch the type, bam, I have the derivatives, and I didn't have to touch the program otherwise (a few hundred lines of gnarly C++ with classes and stuff).
<davexunit>gnusosa: current blocker is that I can't actually figure out how to generate the html documentation for myself
<davexunit>it seems like they have some web service to do it and the web pages require javascript to render. :(
<gnusosa>davexunit: so you're not generating source code out of the documentation?
<gnusosa>davexunit: can I ask how to do that? is there a library in Guile for that?
<davexunit>gnusosa: you have to build something yourself
<gnusosa>I've done low-level bindings in Ruby/C but never automatically.
<davexunit>depending on what the project has available.
<davexunit>but I can see from the libgit2 reference that it has every single function, enum, and struct documented and specified.
<gnusosa>Oh I see, you will scrape from the order of the DOM
<davexunit>omg what is it with the ruby community and circular dependencies?
<davexunit>docurium, the ruby program to build libgit2's documentation as html, depends on rugged, the ruby bindings for libgit2
<davexunit>WARNING: Pygments not found. Using webservice.
<mark_weaver>is it possible to reduce the dependency graph to a DAG by removing optional dependencies?
<paroneayea>the packaging system hides the circular dependencies for you
<dsmith-work>Why are people always writing replacements for stuff? Why not just fix or extend what is already there?
<davexunit>mark_weaver: not in this case. for other ruby libraries I've had to disable test suites.
<davexunit>because in Ruby land, those are "development dependencies"
<gnusosa>davexunit: pygments is a python package
<davexunit>but to build from the CCS and have some faith that the software is behaving, you need the dev dependencies.
<mark_weaver>well, presumably there were earlier versions of these packages that didn't have the circular dependency, or else it wouldn't have been possible to get to this point.
<davexunit>and I'm apparently one of the few people in the known universe that are aware of this issue.
<davexunit>gnusosa: yeah, I can install it, but the fact that docurium just uses a web service when it can't find pygments is infuriating.
<mark_weaver>congratulations. not often can someone say those words :)
<davexunit>the first ruby project I noticed this about was rspec
<davexunit>and when I asked the rspec maintainer about it, he didn't understand the issue
<davexunit>because everyone just uses pre-built gems from rubygems.org, so why am I trying to build from source without using gems from that site?
<davexunit>this is more of a discussion for #guix, now.
<davexunit>I will try doxygen on libgit2 sometime later.
<davexunit>and I can go from there, perhaps going the guile-xcb route of writing a custom reader.
*gnusosa reads about guile-xcb
<davexunit>and the other files in that directory. cool stuff.
<paroneayea>so a real advantage of match-lambda is not just picking the function as a kind of case
<paroneayea>davexunit: my use case: in a sly context it looks like I could signal-zip two values, apply them to a signal-map/fold and extract them
<davexunit>signal-zip is if you want to treat them as a list
<paroneayea>davexunit: no, I just want to either update life if a user clicks to toggle a single cell or if it's enough ticks
<paroneayea>I guess I should probably actually instead switch between "edit mode" and "run mode"
<davexunit>paroneayea: just found out that whoever stole my new cable modem threw out the box in my condo building's laundry room.
<davexunit>it could very well be a random person, the other residents let solicitors in all the time :/
*daviid thought things like unpleasant confrontations with (un)human (not)beings happened only in 3d worlds, so in the usa as well... bahh!!!
<davexunit>I'm not a city person, and incidents like this make me yearn for moving back to a small town.
<davexunit>released as free software, and written in OCaml (I think)
<davexunit>I see people raving about OCaml's pattern matcher. does anyone with experience know how our (ice-9 match) stands up to it?
<paroneayea>is there a nice builtin "range" function in guile?
<paroneayea>daviid: davexunit: thanks, that makes these loops less tedious, into simple folds
<daviid>paroneayea: i wrote a dotimes macro, among others
<davexunit>paroneayea: srfi-42 (i think) has eager comprehensions for building lists from a number range and other things
<gnusosa>davexunit: has an example of how to generate the bindings from upstream
<gnusosa>basically, it generates definitions like `make-glx-procedure`
<davexunit>I'm still liking the idea of implementing a new language reader
<gnusosa>davexunit: isn't there anything in Guile-lib or anything in the Guile env to do this kind of stuff?
<davexunit>gnusosa: it varies from project to project, though.
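On the "range" question above: a short sketch of the options mentioned in the channel, using Guile's built-in iota and the srfi-42 eager comprehensions davexunit points to; the dotimes macro is a hypothetical version in the spirit of daviid's.

;; iota is built in: (iota count [start [step]])
(iota 5)        ; => (0 1 2 3 4)
(iota 5 1)      ; => (1 2 3 4 5)

;; srfi-42 eager comprehensions build lists over a number range
(use-modules (srfi srfi-42))
(list-ec (: i 0 10) (* i i))   ; => (0 1 4 9 16 25 36 49 64 81)

;; a hypothetical dotimes macro, in the spirit of daviid's
(define-syntax dotimes
  (syntax-rules ()
    ((_ (var n) body ...)
     (let loop ((var 0))
       (when (< var n)
         body ...
         (loop (+ var 1)))))))

(dotimes (i 3) (display i) (newline))   ; prints 0, 1, 2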
<gnusosa>davexunit: fair enough, but how about this: you could scrape it into a known template and then just consume that template and push out source code :D
<gnusosa>doc -> template -> consumer -> source_code
<daviid>davexunit: gnusosa, as a source of inspiration, maybe: guile-gnome uses gwrap, which uses a python script to produce a .defs file based on .h files ...
<daviid>I'm sorry, correction: the python script produces .defs files that gwrap uses [you don't need gwrap]
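A rough sketch of the doc -> template -> consumer -> source_code idea in Guile, assuming scraped entries of a made-up ("c_name" return-type (arg-type ...)) form and emitting (system foreign) binding definitions; the spec format, helper names, and the example entry are all hypothetical, not libgit2's actual API.

;; Toy "consumer" stage: turn scraped spec entries into binding
;; definitions as s-expressions.  The emitted code needs (system foreign)
;; for pointer->procedure and the type names when it is evaluated.
(use-modules (ice-9 match) (ice-9 pretty-print))

(define lib-name "libgit2")   ; assumption: the shared library being wrapped

(define (c-name->scheme-name str)
  ;; "git_repository_open" -> git-repository-open
  (string->symbol (string-map (lambda (c) (if (char=? c #\_) #\- c)) str)))

(define (type->form t)
  ;; pointer arguments are written as the quoted symbol *; other types
  ;; refer to (system foreign) bindings such as int, double, size_t
  (if (eq? t '*) ''* t))

(define (emit-binding entry)
  (match entry
    ((c-name ret args)
     `(define ,(c-name->scheme-name c-name)
        (pointer->procedure ,(type->form ret)
                            (dynamic-func ,c-name (dynamic-link ,lib-name))
                            (list ,@(map type->form args)))))))

;; With a hypothetical spec entry (a real generator would scrape these
;; from the reference documentation):
(pretty-print (emit-binding '("git_example_function" int (* *))))
;; prints something like:
;;   (define git-example-function
;;     (pointer->procedure int
;;                         (dynamic-func "git_example_function"
;;                                       (dynamic-link "libgit2"))
;;                         (list '* '*)))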