I originally had it saved as [[ https://www.r2d3.us/visual-intro-to-machine-learning-part-1/ ]] but that link seems to be dead now?
The mechanical analog computers of old (e.g. https://youtu.be/IgF3OX8nT0w, or https://youtu.be/s1i-dnAH9Y4) are further examples that math is more than symbol manipulation.
This is obviously beautiful as art, but it would also be worth understanding how exactly these visualizations help the people who find them helpful. To me, "useful" means gaining a new ability to extrapolate in task space (aka "understanding").
https://www.youtube.com/watch?v=eMXuk97NeSI&t=25
It says the input has 3 dimensions: two spatial dimensions and one feature dimension. So it would be a 2D grid of numbers, like a grayscale photo. But at 00:38 it shows the numbers, and each of the blocks positioned in 3D space appears to hold its own floating-point value, which would make it a 4-dimensional input.
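A possible resolution of the confusion above, assuming the common (height, width, channels) tensor layout used by NumPy and most deep-learning frameworks (the video itself may use a different convention): "3 dimensions" can mean a rank-3 array, where the feature/channel axis is a third axis rather than an extra value stored alongside the grid.

```python
import numpy as np

# A grayscale image: 2 spatial axes plus 1 feature (channel) axis.
# In the (H, W, C) layout this is a rank-3 array even though the
# channel axis has size 1.
grayscale = np.zeros((32, 32, 1), dtype=np.float32)
print(grayscale.ndim)  # 3 axes: height, width, channels

# An RGB image keeps the same rank; only the channel size grows.
rgb = np.zeros((32, 32, 3), dtype=np.float32)
print(rgb.ndim)  # still 3 axes

# Rendering each (row, col, channel) cell as a block in 3D space puts
# one scalar in each block; the data is still a rank-3 tensor. The
# floating-point value per block is the tensor's entry, not a 4th axis.
```

Under that reading, what looks like a "4-dimensional input" in the animation is just a rank-3 tensor drawn with one axis per spatial/feature dimension and the scalar values shown inside the blocks.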
The neural network visualization is particularly well done - seeing the forward and backward passes in action helps build the right mental model. Would be great to see more visualizations covering transformer architectures and attention mechanisms, which are often harder to grasp.
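The forward/backward mental model those animations build can be sketched in a few lines. This is my own toy example (a single linear layer with a mean-squared-error loss, not the network from the video), just to make the two passes concrete:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data and a single linear layer: y_hat = x @ W
x = rng.standard_normal((4, 3))   # batch of 4 inputs, 3 features each
W = rng.standard_normal((3, 2))   # layer weights
y = rng.standard_normal((4, 2))   # targets

# Forward pass: run inputs through the layer and measure the loss.
y_hat = x @ W
loss = np.mean((y_hat - y) ** 2)

# Backward pass: chain rule from the loss back to the weights.
grad_y_hat = 2 * (y_hat - y) / y.size  # dL/dy_hat
grad_W = x.T @ grad_y_hat              # dL/dW

# One gradient-descent step along -grad_W reduces the loss.
W_new = W - 0.1 * grad_W
new_loss = np.mean((x @ W_new - y) ** 2)
assert new_loss < loss
```

The visualizations essentially animate these two flows: activations moving forward through the layers, then gradients moving backward along the same connections.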
For anyone building educational tools or internal documentation for ML teams, this approach of animated explanations is really effective for knowledge transfer.