https://0byte.io/articles/neuron.html
https://0byte.io/articles/helloml.html
He also publishes to YouTube, with clear explanations and high production values that deserve more views.
https://www.youtube.com/watch?v=dES5Cen0q-Y (part 2 https://www.youtube.com/watch?v=-HhE-8JChHA) is the video to accompany https://0byte.io/articles/helloml.html
What worked well was the progressive complexity. Starting with basic mesh rendering before jumping into differentiable rendering made the concepts click. The voxel-to-mesh conversion examples were particularly clear.
If anything, I'd love to see a follow-up covering point cloud handling, since that seems to be a major use case based on the docs I'm now digging through.
Thanks for writing this — triggered a weekend deep-dive I probably wouldn't have started otherwise.
Yet 2D and 3D graphics feel relatively natural, maybe because I can at least visualize that kind of math.
Of course, that intuition breaks down once the gradient can no longer be pictured as an arrow in 2D or 3D space; not all concepts transfer to higher dimensions as easily as one would hope, though some do.
The other struggle is making heads or tails of what a neural network with backpropagation actually means.
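The chain-rule bookkeeping that backpropagation performs can be sketched by hand for a single neuron; here's a minimal illustrative example (all names and numbers are made up, not taken from any of the linked articles), with a finite-difference sanity check:

```python
import math

# Illustrative inputs: one weight, one bias, one input, one target.
x, w, b, target = 0.5, -1.2, 0.3, 1.0

# Forward pass, keeping intermediates so the backward pass can reuse them.
z = w * x + b             # pre-activation
y = math.tanh(z)          # activation
loss = (y - target) ** 2  # squared error

# Backward pass: apply the chain rule from the loss back to each parameter.
dloss_dy = 2 * (y - target)
dy_dz = 1 - y ** 2        # derivative of tanh at z
dloss_dz = dloss_dy * dy_dz
dloss_dw = dloss_dz * x   # since dz/dw = x
dloss_db = dloss_dz * 1   # since dz/db = 1

# Sanity check: nudge w slightly and compare against the analytic gradient.
eps = 1e-6
loss_eps = (math.tanh((w + eps) * x + b) - target) ** 2
numeric = (loss_eps - loss) / eps
print(abs(dloss_dw - numeric) < 1e-4)  # True
```

This is exactly the exercise micrograd automates: every operation records its local derivative, and `backward()` multiplies them out along the graph.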
There is a lot of content on PyTorch, which is great and makes sense given how heavily it's used, but where the industry really needs help and support is the fundamentals. Nonetheless, great contribution!
For a deeper tutorial, I highly recommend the PyTorch for Deep Learning Professional Certificate on deeplearning.ai: probably one of the best MOOCs I've seen so far.
https://www.deeplearning.ai/courses/pytorch-for-deep-learnin...
Free book: https://zekcrates.quarto.pub/deep-learning-library/
ML by hand: https://github.com/workofart/ml-by-hand
Micrograd: https://github.com/karpathy/micrograd
Thank you, Sol Roth