Show HN: Keep your PyTorch model in VRAM by hot swapping code
74 points | 1 day ago | 2 comments | github.com
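For context, the idea in the title boils down to keeping the model loaded in a long-running process and reloading only the surrounding code between edits. A minimal sketch of that general pattern (not the linked project's implementation; experiment.py, run(), and model.pt are placeholder names):

    # Minimal sketch of the idea, not the linked project's implementation.
    # The model is loaded once in a long-running process and stays in VRAM;
    # only the code in a placeholder module `experiment.py` is reloaded.
    import importlib
    import torch

    import experiment  # placeholder module exposing run(model)

    # The expensive step happens once per session.
    model = torch.load("model.pt", map_location="cuda")  # placeholder checkpoint
    model.eval()

    while True:
        input("Edit experiment.py, then press Enter to re-run... ")
        importlib.reload(experiment)   # pick up the edited code
        with torch.no_grad():
            experiment.run(model)      # model weights never leave the GPU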
NitpickLawyer | 1 day ago
We use Python notebooks for that functionality in the early stages of script testing: load the model in a cell at the top, do your work in the cells below, and once things look good, convert it to a normal Python script.
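A rough sketch of that notebook pattern (cell markers in "# %%" style; the checkpoint path and input shape are placeholders):

    # %% Cell 1: run once; the loaded model stays in VRAM for the whole session
    import torch
    model = torch.load("model.pt", map_location="cuda")  # placeholder checkpoint
    model.eval()

    # %% Cell 2: iterate here; re-running this cell never reloads the model
    x = torch.randn(1, 16, device="cuda")  # dummy input, shape for illustration
    with torch.no_grad():
        y = model(x)
    print(y.shape)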
pizza | 1 day ago
The tensor visualizer app itself already looks pretty interesting.
valine | 1 day ago
Thanks, I will do a deep writeup on that at some point.
kombine | 1 day ago
Are you running both the DearImGui visualisation and the training locally? If not, how can one use it in client-server mode? I think this is the most common requirement for visualisation libraries in deep learning.
valine | 1 day ago
The rendering is done with OpenGL, and for remote viewing I just render to an offscreen framebuffer and stream it back to the client with WebRTC. The code for that isn't public yet; it still needs some cleanup.
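Not the author's published code, but a minimal sketch of the offscreen-render-and-read-back step, assuming moderngl for the headless OpenGL context; the resulting frames would then be fed to a WebRTC video track (e.g. an aiortc VideoStreamTrack) for streaming:

    # Sketch only: offscreen OpenGL rendering with moderngl (an assumption;
    # the author's code is not public) and reading the pixels back as a frame.
    import numpy as np
    import moderngl

    WIDTH, HEIGHT = 1280, 720
    ctx = moderngl.create_standalone_context()             # headless GL context
    fbo = ctx.simple_framebuffer((WIDTH, HEIGHT), components=3)

    def render_frame() -> np.ndarray:
        """Render one frame offscreen and return it as an HxWx3 uint8 array."""
        fbo.use()
        fbo.clear(0.1, 0.1, 0.1)
        # ... draw the tensor visualisation here ...
        data = fbo.read(components=3)                       # raw RGB bytes
        frame = np.frombuffer(data, dtype=np.uint8).reshape(HEIGHT, WIDTH, 3)
        return np.flipud(frame)                             # GL origin is bottom-left

    # Each frame would then be handed to a WebRTC video track and streamed
    # to the remote viewer.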
iaw | 1 day ago
Yeah, sadly the link to their visualizations is gated behind X.com.
CheeksTheGeek | 1 day ago
You can use xcancel.com by adding "cancel" after the "x" in the URL.