IRC channel logs

2019-12-08.log


<spk121>guile3 will need work to compile with the standard Debian hardening flags
<rlb>spk121: haven't tried it yet, but thanks for the note. Worst case we can probably just back the flags off until the issues are sorted.
<rgrinberg>Is there a good example of a guile project with a standard way to setup tests/compilation/dependencies?
<daviid>rgrinberg: most of us use the GNU autotools chain; there are numerous guile projects to look at if you follow that 'route'
<daviid>str1ngs: fwiw, I pushed the (new) doc for the 'Closure' high-level API, then I merged everything I did so far to master as well
<daviid>I should provide some (new) doc for the g-idle (and friends) procs, then get back to work on signals ...
<catonano_>rgrinberg: there's a project called "guile-hall"; it creates a project skeleton for you with building (with tests for dependencies) and testing. The weak point is when you have to wrap C libraries. It doesn't support them; you'll have to edit your configure.ac file by hand in that case
<catonano_>rgrinberg: the project it prepares for you is based on the autotools. Autoconf supports Guile to some extent and guile-hall leverages that
<catonano_>rgrinberg: https://gitlab.com/a-sassmannshausen/guile-hall
<catonano_>rgrinberg: does that help ?
<wingo>spk121: what hardening flags are those? feel v free to commit any fixes directly btw
<wingo>something is wrong about in-loop tier-up
<wingo>i.e. not kicking in like it should
<wingo>array1 takes 23s with normal GUILE_JIT_THRESHOLD and 6s with GUILE_JIT_THRESHOLD=0
***janneke_ is now known as janneke
<str1ngs>sneek: later tell daviid sounds good, thanks.
<sneek>Okay.
<optima>I want to improve my guile skills for guix, but I can't pinpoint the use case for guile
<optima>Because when I look at python, I know it's great for short scripts or glue code
<optima>C for speed and accuracy
<optima>But what do I do with guile?
<rotucer>Achieve enlightenment... Jk. Guix is a good reason on its own to learn it tbh
<zig>optima: I am using guile to create a multithreaded web application that can query wikidata, which is not possible as-is with the known Python implementations.
<zig>in the sense that all Python implementations (except Jython 2.7) have a global interpreter lock, and for my work I need multiple threads.
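A minimal sketch of what native threading looks like in Guile, using the (ice-9 threads) module; the procedure and workload below are illustrative only, not from zig's application:

    ;; Guile threads are native OS threads with no global interpreter lock,
    ;; so CPU-bound Scheme code can run in parallel.
    (use-modules (ice-9 threads))

    ;; A stand-in for real per-request work.
    (define (expensive-sum n)
      (let loop ((i 0) (acc 0))
        (if (= i n) acc (loop (+ i 1) (+ acc i)))))

    ;; par-map spreads the calls over a pool of native threads.
    (display (par-map expensive-sum '(1000000 2000000 3000000)))
    (newline)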
<rgrinberg>catonano_ yeah that looks like a good start, thanks. I thought I could avoid autotools though :/
<zig>optima: also, if you forgo goops, guile is much simpler than Python (I have never used goops so far, in 5 years of guile)
<zig>optima: there is also the fact that guile is lisp, the syntax is a no-brainer.
<catonano_>rgrinberg: you can avoid them. They are needed only in some cases: if your project wraps a C library AND you want to distribute it, or if you want to distribute your project in a way that it can be installed for all the users on a system. Otherwise you can keep your source files scattered around and they will be compiled automatically when you use them
<catonano_>optima: you can use Guile for all the tasks where bash is used. Further, Guile can wrap C libraries and access their data structures directly (without marshalling the output to text and parsing it again every time)
<catonano_>optima: you could easily do an LSP server or an XMPP node
<catonano_>optima: you could do something that creates svg files that get visualized by external tools
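A minimal sketch of calling into a C library from Guile via the dynamic FFI in (system foreign), with no C glue code; strlen is used only as a convenient example of a symbol that is already loaded:

    (use-modules (system foreign))

    ;; Bind the C function strlen(const char *) as a Scheme procedure.
    (define strlen
      (pointer->procedure size_t
                          (dynamic-func "strlen" (dynamic-link))
                          (list '*)))

    (display (strlen (string->pointer "hello")))  ; => 5
    (newline)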
<chrislck>#guilers!
<chrislck>I've just discovered that when guile is called from C, structs are accessible via structname-varname-set structname-varname-get
<chrislck>Is this a supported API?
<chrislck>or is it an implementation detail?
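For comparison, a minimal sketch of Guile's documented low-level struct interface (make-vtable / struct-ref / struct-set!); the generated structname-varname-get/-set accessors mentioned above presumably come from the C wrapping layer rather than from this core API:

    ;; A vtable with one writable, garbage-collected ("pw") field.
    (define v (make-vtable "pw"))
    (define s (make-struct/no-tail v 42))
    (display (struct-ref s 0))   ; => 42
    (struct-set! s 0 99)
    (display (struct-ref s 0))   ; => 99
    (newline)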
***ng0_ is now known as ng0
<zig>nomunofu is almost ready. I am trying it with one billion triples. It might take a few days.
<zig>then guix pack, then something, then profit!
<zig>;-)
<apteryx_>Hello! I'm trying to format an Elisp expr to pass to Emacs, and I can't seem to escape that one: (emacs-batch-eval `(mapc \#'load ,autoloads))
<apteryx_>the #' is giving me trouble; Guile expands it to: (mapc #{\#'load}# ("some-file.elc")), for example.
<apteryx_>any idea?
***apteryx_ is now known as apteryx
<apteryx>basically, can I form a symbol starting with the # character in Scheme?
<chrislck>'#{#sym}#
<apteryx>It seems for the use case at hand, it makes more sense to just pre-format the string, as it eventually gets formatted as such
<apteryx>chrislck: interesting, that's how Guile expands a variable named with \#some-name. How would you later evaluate (refer to) such a variable?
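A minimal sketch of building such a symbol programmatically; string->symbol accepts any characters, and Guile writes the result back using its extended #{...}# symbol syntax, which can also be typed in source to refer to the same symbol:

    (define sym (string->symbol "#'load"))
    (display (symbol->string sym))   ; => #'load
    (write sym)                      ; printed using the #{...}# extended syntax
    (newline)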
<civodul>hey there!
<civodul>wingo: congrats on the 2.9.6 release and associated benchmarks!
<civodul>the performance difference compared to 2.2 is impressive
<civodul>the right-hand side of the graph is more intriguing, but hey ;-)
<rlb>wingo zig: I ended up fixing the earlier compile hang by just switching to a goops wrapper class, i.e. not calling (class-of (future ...)) during compilation to get the class. Thanks for the help. It'd eventually be nice if <future> were available and something you could use in method specializations.
<rlb>(Might also be nice if threads didn't hang compilation, but not critical.)
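A rough sketch of the kind of wrapper described above (the class and method names are invented for illustration, not rlb's actual code):

    (use-modules (oop goops) (ice-9 futures))

    ;; Wrap the future in a GOOPS class so methods can specialize on it,
    ;; instead of calling class-of on a future object at compile time.
    (define-class <future-box> ()
      (future #:init-keyword #:future #:getter box-future))

    (define-method (force-it (b <future-box>))
      (touch (box-future b)))

    (display (force-it (make <future-box> #:future (future (+ 1 2)))))  ; => 3
    (newline)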
<rlb>wingo: regarding flags -- not sure exactly what spk121 was seeing, but current flags appear to be:
<rlb>$ dpkg-buildflags --dump
<rlb> CFLAGS=-g -O2 -fdebug-prefix-map=/home/rlb=. -fstack-protector-strong -Wformat -Werror=format-security
<rlb> CPPFLAGS=-Wdate-time -D_FORTIFY_SOURCE=2
<rlb>via "dpkg-buildflags --dump"
<rlb>RhodiumToad: filed bug with patch for the thread local fluid-ref problem. Thanks again.
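A minimal sketch of the thread-local fluid API referenced above, assuming a recent Guile that provides make-thread-local-fluid:

    ;; A thread-local fluid holds a separate value per thread, and that
    ;; value is not captured or restored by dynamic states the way an
    ;; ordinary fluid's value is.
    (define tl (make-thread-local-fluid 0))

    (fluid-set! tl 42)
    (display (fluid-ref tl))   ; => 42 in the current thread
    (newline)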
<dftxbs3e>hey, why does GNU Guile not use all the available threads to compile?
<dftxbs3e>Only 8 of them are used at 100% even though 24 are available
<dftxbs3e>GNU Guile spawns 24 threads, but only uses 8 of them
<wingo>fixed the array1 problem :P
<wingo> https://wingolog.org/pub/guile-2.2.6-vs-2.9.7-microbenchmarks.png
<wingo>the mbrot result is... surprising
<jcowan>I think the slowdowns at the other end are more surprising
<jcowan>Mandelbrot benefits hugely from compiling away boxing
<civodul>wingo: i'm sure you precompiled the result of mbrot, didn't you ;-)
<wingo>i mean that result is top of the leaderboard
<wingo>1.9s on this machine, compared to gerbil/gambit getting around 3.3s on ecraven's benchmark rungs. dunno how that is happening
<wingo>*runs
<wingo>ah well, outliers.