IRC channel logs



<rekado>I recently watched a video on micrograd as a way to learn backpropagation and gradient descent in neural networks.
<rekado>it’s written in Python, so I thought it would be a good idea to translate it to Scheme.
<rekado>the whole time I was annoyed by the OOP style that underpins graph representations and algorithms
<rekado>found this paper on inductive graphs:
<rekado>and implemented a little graph library:
<rekado>I’d be happy to see more Scheme in the space of neural networks.
<rekado>we have aiscm, but I’d be happier if it didn’t tie me to tensorflow.
<civodul>looks nice!
<civodul><context> is used in a way similar to the "zipper" structure, right?
<rekado>match-context works like a zipper
<rekado>it matches on a graph and returns the context + the remaining graph (without that context).
<rekado>match-context and graph-cons are a bit expensive because they need to protect the ordering invariant that a context can only refer to nodes that already exist in the graph
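The ordering invariant rekado describes can be illustrated with a rough Python sketch of an Erwig-style inductive graph. This is a hypothetical toy, not the actual Scheme library: a graph is either empty or a context (node, label, in-edges, out-edges) consed onto a smaller graph, and both `graph_cons` and `match_context` have to do extra work to keep contexts referring only to nodes that already exist.

```python
from dataclasses import dataclass

@dataclass
class Context:
    node: int
    label: str
    ins: list    # predecessor node ids (edges into this node)
    outs: list   # successor node ids (edges out of this node)

class Graph:
    def __init__(self, contexts=None):
        # newest context first; invariant: each context only refers
        # to nodes appearing later in this list
        self.contexts = contexts if contexts is not None else []

def graph_cons(ctx, graph):
    """Add a context, checking the ordering invariant: a context may
    only mention nodes already present in the graph (this check is
    part of what makes the operation a bit expensive)."""
    known = {c.node for c in graph.contexts}
    if not (set(ctx.ins) <= known and set(ctx.outs) <= known):
        raise ValueError("context refers to nodes not yet in the graph")
    return Graph([ctx] + graph.contexts)

def match_context(node, graph):
    """Zipper-like decomposition: return (context for node, rest).
    Edges mentioning the node in later contexts are folded into the
    returned context, so the remaining graph no longer refers to it."""
    ctx, rest = None, []
    for c in graph.contexts:
        if c.node == node:
            ctx = Context(c.node, c.label, list(c.ins), list(c.outs))
        else:
            rest.append(c)
    if ctx is None:
        raise KeyError(node)
    cleaned = []
    for c in rest:
        if node in c.ins:            # there is an edge node -> c.node
            ctx.outs.append(c.node)
        if node in c.outs:           # there is an edge c.node -> node
            ctx.ins.append(c.node)
        cleaned.append(Context(c.node, c.label,
                               [n for n in c.ins if n != node],
                               [n for n in c.outs if n != node]))
    return ctx, Graph(cleaned)
```

The decomposition into "context + graph without that context" is exactly the zipper-like behavior discussed above.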
<rekado>now that I have this little graph library I see that it suggests being used in a monad
<rekado>micrograd is a little expression language that generates a graph as you apply operations to values; each resulting value keeps references to the terms that went into the operation.
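That graph-building behavior can be sketched in a few lines of Python. This is a minimal toy in the spirit of micrograd, not its actual code: every arithmetic operation records its operands, so evaluating an expression builds the computation graph as a side effect, and `backward` walks it in reverse.

```python
class Value:
    """A value that remembers the terms it was computed from."""
    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = lambda: None   # local gradient rule

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topologically order the recorded graph, then apply the
        # chain rule from the output back to the leaves
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()
```

Note that the graph lives implicitly in the mutable `_parents` references, which is precisely the OOP style the inductive-graph library avoids.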
<rekado>when working with contexts, though, it seems that we should thread the graph through operations -> monad
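A minimal sketch of that state-monad-style threading, in Python for illustration (all names here, `unit`, `bind`, `add_node`, and the dict-based graph, are hypothetical): each operation is a function from a graph to a (result, new graph) pair, and `bind` sequences operations so the updated graph flows along without being mentioned explicitly.

```python
def unit(x):
    """Wrap a plain value as an operation that leaves the graph alone."""
    return lambda graph: (x, graph)

def bind(m, f):
    """Run m, feed its result to f, threading the graph through both."""
    def run(graph):
        x, g2 = m(graph)
        return f(x)(g2)
    return run

def add_node(label, parents=()):
    """Operation: insert a node and return its id plus the extended
    graph; the graph is just a dict {id: (label, parents)} here."""
    def run(graph):
        nid = len(graph)
        g2 = dict(graph)
        g2[nid] = (label, tuple(parents))
        return nid, g2
    return run

# Build a * b without ever naming the intermediate graphs:
prog = bind(add_node("a"), lambda a:
       bind(add_node("b"), lambda b:
            add_node("mul", (a, b))))

result, final_graph = prog({})
```

The nested `bind` calls play the role that monadic do-notation (or Guile's `mlet*`-style forms) would play in Scheme.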
<civodul>ah yes, could be
<civodul>fun, i'm offered to participate in an event 3 weeks from now at the other end of the world
<zimoun>civodul: ahah! Is it Guix 10 Years? Because Paris is not the other end of the world; I mean from your place. ;-)
<efraim>Something in Japan?
<nckx>The (populated) antipode of Paris is Waitangi, Chatham Islands, New Zealand.
<nckx>I guess they have a LUG.
<drakonis>are they paying for the flight?
<rekado>the (chartered) ship has already sailed