I don’t do much data science, and I’ve only used it for one project (speeding up my dad’s Octave code), but I really liked it. Things were generally very fast even before I did any deliberate optimization: without hyperbole, a direct port of my dad’s code ran 50–100x faster in most cases, and after some optimization I got it to roughly 150x.

There was a bit of weirdness where type instability in my code fell back to dynamic dispatch and slowed things down, but annotating types in function signatures and struct fields resolved those issues for me.
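The kind of slowdown I mean looks roughly like this (toy names, not from my actual project): an untyped field is `Any` under the hood, so every operation on it goes through a runtime dispatch, while a concrete annotation lets the compiler specialize.

```julia
# Untyped field: x is effectively Any, so b.x^2 dispatches at runtime.
struct SlowBox
    x
end

# Concrete field type: the compiler can specialize and skip the lookup.
struct FastBox
    x::Float64
end

sumsq(b, n) = sum(b.x^2 for _ in 1:n)

# @time sumsq(SlowBox(1.5), 10^6)  # allocates on every iteration
# @time sumsq(FastBox(1.5), 10^6)  # no allocations after compilation
```

`@code_warntype` is handy for spotting these cases: anything highlighted as `Any` or a red `Union` is a dispatch point.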

I also thought the macro system was pretty nice; for the most part, creating custom syntax with the built-in helpers was straightforward and easy to grok.
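For a flavor of what I mean, here's a tiny sketch of a custom macro (made up for illustration, similar in spirit to Base's `@time`): you quote an expression, splice the user's code back in with `esc`, and get new syntax for free.

```julia
# A toy macro that times an expression and returns its value.
macro howlong(ex)
    quote
        t0 = time()
        val = $(esc(ex))   # esc keeps the user's variables in their own scope
        println("took ", round(time() - t0; digits=3), " s")
        val
    end
end

# Usage:
# @howlong sum(rand(10^6))
```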

Since I don’t do much data work I haven’t had much of an excuse to use it again, but it does offer channels and queues for task and thread synchronization, so I might yet find a project for it.
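For anyone curious, the channel API looks roughly like this (a toy producer/consumer sketch, not from my project): `put!` and `take!` block on a full or empty channel, so the `Channel` itself acts as the synchronization point between tasks.

```julia
# One task produces values into a buffered Channel; the main task consumes.
ch = Channel{Int}(4)  # capacity 4; put! blocks when the buffer is full

producer = @async begin
    for i in 1:10
        put!(ch, i^2)
    end
    close(ch)  # signals consumers that no more values are coming
end

results = Int[]
for v in ch   # iterating a Channel take!s values until it is closed
    push!(results, v)
end
# results now holds the squares 1, 4, 9, ..., 100 in order
```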