Yeah, I've been using this trick to implement a hot reload library that updates only the specific functions that have changed, without re-executing modules: https://github.com/breuleux/jurigged
I also use it in a multiple dispatch library (https://github.com/breuleux/ovld) to replace the entry point with specialized dispatch code in order to cut some overhead.
It's fun.
Also, why is every damn post these days somehow framed in an AI context? It's exhausting.
> Also, why is every damn post these days somehow framed in an AI context? It's exhausting.
Every 5-to-10-year segment of my life has had one or two "this is the future" hypes running alongside it. Previously it was X, now it's Y. And most of the time, everything else somehow gets connected to the currently hyped subject, whether it's actually related or not.
The only thing I've found helpful is thinking about it and changing my perspective and framing. I'll read an article like this one that is only tangentially related to AI, where the meat is really about something else. So mentally I just ignore the AI parts and frame it some other way in my head.
People can suddenly write their articles with hooks into the hyped subject and I don't mind; I'm reading them for other purposes and still get takeaways that are helpful. A tiny Jedi mind-trick for avoiding that exhaustion :)
I also find it useful to keep in mind that (often more junior) people are learning new things and expressing their joy, which is a good thing. And most (junior) people learning things in tech right now are doing so in the context of AI, for better or worse.
(idk if this author is “junior” per se, mostly just agreeing that the shift in perspective helps to not get burnt out by things like this)
AI, blockchain, Rust, Go, serverless, NoSQL, Ruby on Rails... the list goes on and on :-)
Some of it gets really annoying on the business side, because companies like Gartner jump on the trends, and they have enough influence that businesses have to pay attention. When serverless was a thing, every cloud provider effectively had to add serverless offerings even if it made zero sense and no customers were asking for it, simply to satisfy Gartner (and their ilk) and be seen as innovating and ahead of the curve. The same thing happened with blockchain, and is currently happening with AI.
wheels get reinvented again again again and again … this is quite unique to info tech … imagine if mathematicians did the same, the world would be in chaos …
> wheels get reinvented again again again and again
And yet, it took us (humans) a long time to turn the wheel around and use it for transportation, so maybe we need to reinvent even more things, turn them around, and such :)
Oh, that is really interesting. I was only aware of IPython's autoreload extension; I hadn't found your library. I'm also working on hot reload for Python as part of a development environment that aims to give Python a development experience closer to Lisp: https://codeberg.org/sczi/swanky-python/
Some minor details. You currently aren't updating functions if their freevars have changed; you can actually do that by using the C API to update __closure__, which is a read-only attribute from Python.
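Roughly like this (just a sketch; set_closure is my own helper name, and it assumes the replacement values line up with the new code object's co_freevars):

    import ctypes
    import types

    # __closure__ is read-only from Python, but the C API setter
    # PyFunction_SetClosure is reachable through ctypes.pythonapi.
    ctypes.pythonapi.PyFunction_SetClosure.argtypes = (ctypes.py_object, ctypes.py_object)
    ctypes.pythonapi.PyFunction_SetClosure.restype = ctypes.c_int

    def set_closure(func, values):
        # One fresh cell per free variable of the (new) code object.
        cells = tuple(types.CellType(v) for v in values)
        if ctypes.pythonapi.PyFunction_SetClosure(func, cells) != 0:
            raise RuntimeError("PyFunction_SetClosure failed")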
Also, I think you should update the __annotations__, __type_params__, __doc__, and __dict__ attributes of the function.

Rather than using gc.get_referrers, I just maintain a set for each function containing all the old versions (using weakrefs, so they go away if an old version isn't referenced by anything anymore). Then when a function updates I don't need to find all references: every reference points at some old version of the function, so I just update that set of old functions and all references end up using the new code. I took this from IPython autoreload. I think it is both more efficient than gc.get_referrers and more complete, since it solves the issue of references "decorated or stashed in some data structure that Jurigged does not understand". The code for that is here: https://codeberg.org/sczi/swanky-python/src/commit/365702a6c...
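In outline it's something like this (simplified, with made-up names rather than the actual code from swanky-python or IPython):

    import weakref
    from collections import defaultdict

    # Every function object ever registered under a given (module, qualname),
    # held weakly so stale versions disappear once nothing references them.
    _versions = defaultdict(weakref.WeakSet)

    def register(func):
        _versions[(func.__module__, func.__qualname__)].add(func)
        return func

    def update(new_func):
        key = (new_func.__module__, new_func.__qualname__)
        for old in list(_versions[key]):
            # Raises ValueError if the number of free variables changed,
            # hence the __closure__ trick above.
            old.__code__ = new_func.__code__
            old.__defaults__ = new_func.__defaults__
            old.__kwdefaults__ = new_func.__kwdefaults__
            old.__doc__ = new_func.__doc__
            old.__dict__.update(new_func.__dict__)
        _versions[key].add(new_func)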
Hot reload for Python is quite tricky to fully get right. I'm still missing plenty of parts that I know about and plan on implementing, and surely plenty more that I don't even know about. If you or anyone else who has worked on hot reload in Python wants to talk about it, I'm happy to; just reach out, my email is visible on codeberg if you're signed in.
Thanks for the tips, I'll try to look into these when I get some time! Didn't know you could modify the closure pointer.
I'm not sure what you mean by "maintaining a set of old versions". It's possible I missed something obvious, but the issue here is that I have the code objects (I can snag them from module evaluation using an import hook)... but I do not have the function objects. I never had them in the first place. Take this very silly and very horrible example:
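(Reconstructed here, so treat the exact snippet as illustrative.)

    # Closures created on the fly and stashed in a dict; nothing here is a
    # named, module-level function the reloader could see directly.
    adders = {}

    def make_adder(n):
        def add(x):
            return x + n
        return add

    for n in range(10):
        adders[n] = make_adder(n)  # ten distinct function objects, one shared __code__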
The adders dictionary is dynamically updated with new closures. Each is a distinct function object with a __code__ field. When I update the inner function, I want all of these closures to be updated. Jurigged is able to do it -- it can root them out using get_referrers. I don't see how else to do it. I quickly tested in a Jupyter notebook, and it didn't work: new closures have the new code, but the old ones are not updated.

Oooh, now that is interesting. That's what I mean by stuff I don't even know that I don't know :)
Yes, mine doesn't handle that; it is the same as Jupyter there. Smalltalk is supposed to be the best at interactive development, and I wonder if it would update the old closures. I don't know it well enough to try, but I do know Common Lisp, which is also supposed to be quite good at this, and fwiw it behaves the same: new closures get the new code, but the old ones are not updated.
Interesting. It appears that not many systems have been designed to enable hot reload in a thorough way.
Here's another fun complication: what if I have a decorator that performs a code transform on the source code of the decorated function? If the code is changed, I would like to automatically re-run the transform on the new code. I made a (kind of awkward) protocol for this: https://github.com/breuleux/jurigged?tab=readme-ov-file#cust.... It's a little confusing even for me, so I should probably review it.
Wow it looks like you are taking reloading to another level I hadn't even considered.
For example in your elephant:main.py test, in swanky python I run do(3): ['Paint 3 canvasses', 'Sing 3 songs', 'Dance for 3 hours']
change songs to songz, and now do(3) is: ['Paint 3 canvasses', 'Sing 3 songs', 'Dance for 3 hours', 'Sing 3 songz']
Rather than changing the earlier songs to songz, as jurigged manages to do. But any Lisp environment would behave the same; we don't have the idea of:
> 3. When a file is modified, re-parse it into a set of definitions and match them against the original, yielding a set of changes, additions and deletions.
We are just evaling functions, or whatever sections of code you tell it to eval, not parsing files and seeing what was modified. So in some cases we might need to make a separate unregister function and call that on the old one. For example, in Emacs, if you use advice-add (which adds before, after, and other kinds of hooks to a function), you can't just change the lines adding an advice and save the file to have it modify the old advice; you need to explicitly call advice-remove to unset the old advice and then advice-add with your new advice if you want to modify it while running, without restarting.
When I eval a function again, I am evaling all of its decorators again; in your readme you describe the downside of that:
> %autoreload will properly re-execute changed decorators, but these decorators will return new objects, so if a module imports an already decorated function, it won't update to the new version.
But I think I am handling that, or maybe you have other cases in mind that I am missing? e.g. in a.py:
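(Reconstructed; the logged decorator here is just an illustrative stand-in.)

    # a.py -- a decorated function that another module imports directly
    import functools

    def logged(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            print(f"calling {fn.__qualname__}")
            return fn(*args, **kwargs)
        return wrapper

    @logged
    def reload_me():
        return "old behaviour"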
Then in b.py, do from a import reload_me, change reload_me in a and slime-eval-defun it, and b is using the new version of reload_me. Basically, for each (__module__, __qualname__) I am storing the function object and all old versions of the function object. Then when there is a new function object with that name, I update the code, closure, and other attributes of all the old function objects to be the same as the new one.

I'll look into maybe just integrating jurigged to provide the reloading within swanky python. I was using the IPython autoreload extension at first, but ran into various problems with it, so I ended up doing something custom that is still mostly based on IPython and is working for me in practice for now. As long as I don't run into problems with it I'll focus on the many other parts of swanky python that need work, but sooner or later, when I inevitably run into reloading problems, I'll evaluate whether to just switch the reloading over to jurigged.
> Lisp and Smalltalk addressed this by not unwinding the stack on exceptions, dropping you into a debugger and allowing you to fix the error and resume execution without having to restart your program from the beginning. Eventually I'd like to patch CPython to support this
yea i've been meaning to do this for a while as well...
I haven't started really looking into it yet, but I found this blog post that looks like a good description of what exactly happens during stack unwinding in Python, and it gets a large part of the way toward resuming execution in pure Python without even any native code: https://web.archive.org/web/20250322090310/https://code.lard...
The author says they wrote it as a joke and that it is probably not possible to do robustly in pure Python, but I assume it can be done robustly as a patch to CPython, or possibly even as just a native C extension that gets loaded without people needing a patched build of CPython. If you know any good resources or information about how to approach this, or if you start working on it yourself, let me know.
Jurigged is awesome. It works really well and saves me tons of time. Thank you for making it!
I do wish there were callbacks I could subscribe to that would notify me whenever my file changed or whenever any code changed, so I could re-run some init.
My other feature request would be a way to replace function implementations even when they are currently on the stack, as some other hot reload implementations can. But I certainly understand why this would be difficult.
I use jurigged in conjunction with cmd2 to make command-line driven tools. The combination works well because I can issue a command, watch the output, make a change, hit up and return, and see the change just like that.
Thank you a bazillion for making it. It works quietly in the background without fuss, and I'm grateful for it every time I use it.
> why is every damn post these days somehow framed in an AI context? It's exhausting.
It’s even in the real world now - most of my conversations with people in tech end up at AI eventually.
It kind of reminds me of the 2010s when non-tech people would ask me about crypto at social events.
In some respects that's even nicer to use than a typical editor-integrated live-coding REPL, because one doesn't have to think about what code needs (re)sending from the source to the REPL. Just save the file and it figures out which parts meaningfully changed.
Jurigged is really cool - thanks for the tool!
jurigged is great, love using it for quick GUI prototyping with imgui!