Did something similar a while back [1]; it's the best way to learn neural nets and backprop. Just using NumPy also makes sure you get the math right, without having to deal with higher-level frameworks or C++ libraries.
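To give a flavour of what "just NumPy" means in practice, here's a rough sketch of a hand-written forward and backward pass for a tiny two-layer net (the data, layer sizes, and learning rate are made up for illustration, not taken from any particular project):

```python
import numpy as np

# Toy data: 4 samples, 3 features -> 1 target (made-up numbers)
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))
y = rng.standard_normal((4, 1))

# Tiny 2-layer MLP, small random init
W1 = rng.standard_normal((3, 8)) * 0.1
b1 = np.zeros((1, 8))
W2 = rng.standard_normal((8, 1)) * 0.1
b2 = np.zeros((1, 1))

lr = 0.1
for step in range(100):
    # forward pass
    h_pre = X @ W1 + b1          # (4, 8)
    h = np.tanh(h_pre)           # (4, 8)
    y_hat = h @ W2 + b2          # (4, 1)
    loss = np.mean((y_hat - y) ** 2)

    # backward pass: chain rule written out by hand
    dy_hat = 2 * (y_hat - y) / y.shape[0]   # dL/dy_hat
    dW2 = h.T @ dy_hat
    db2 = dy_hat.sum(axis=0, keepdims=True)
    dh = dy_hat @ W2.T
    dh_pre = dh * (1 - h ** 2)              # tanh'(x) = 1 - tanh(x)^2
    dW1 = X.T @ dh_pre
    db1 = dh_pre.sum(axis=0, keepdims=True)

    # plain SGD update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

Writing the backward pass by hand like this is exactly where the "get the math right" part happens: every shape and transpose has to line up.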
If you are asking about the "micrograd" video, then yes, a little bit. "micrograd" is for scalars, and we use tensors in the book. If you are reading the book, I would recommend first completing the series, or at least the "micrograd" video.
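For what it's worth, the scalar-vs-tensor difference mostly comes down to what a node in the autograd graph holds. A micrograd-style scalar engine looks roughly like this (a simplified sketch, not the actual micrograd or book code):

```python
# Scalar autograd in the micrograd spirit: every single number is its own node.
class Value:
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topological order, then apply the chain rule node by node
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a, b = Value(2.0), Value(-3.0)
loss = a * b + a
loss.backward()
print(a.grad, b.grad)   # d(loss)/da = b + 1 = -2.0, d(loss)/db = a = 2.0
```

A tensor-based engine keeps the same overall structure; each node just holds an array instead of a single float, and the local backward rules turn into matrix transposes and sums rather than plain products.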
It's alright, but a C version would be even better to fully grasp the implementation details of tensors etc. Shelling out to numpy isn't particularly exciting.
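Roughly, the detail a C version forces you to confront is that a tensor is just a flat buffer plus a shape and strides. Here's a toy sketch of that bookkeeping in plain Python (illustrative only; the class name and layout are made up and this is not how NumPy is actually implemented):

```python
# A "tensor" as NumPy/C see it: a flat buffer plus shape and strides.
class Tensor:
    def __init__(self, buffer, shape, strides):
        self.buffer = buffer      # flat list of floats
        self.shape = shape        # e.g. (2, 3)
        self.strides = strides    # elements to skip per step along each dim

    def __getitem__(self, idx):
        # dot product of the index with the strides gives the flat offset
        offset = sum(i * s for i, s in zip(idx, self.strides))
        return self.buffer[offset]

    def transpose(self):
        # no data is copied: a transpose is just swapped shape/strides
        return Tensor(self.buffer, self.shape[::-1], self.strides[::-1])

# 2x3 tensor stored row-major: strides are (3, 1)
t = Tensor([0., 1., 2., 3., 4., 5.], (2, 3), (3, 1))
print(t[(1, 2)])              # 5.0
print(t.transpose()[(2, 1)])  # also 5.0: same buffer, no copy
```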
I agree! That said, what NumPy is doing is actually quite beautiful. I was thinking of writing a custom C++ backend for this thing. Let's see what happens this year.
If someone is interested in low-level tensor implementation details, they could benefit from a course/book along the lines of "let's build NumPy in C". There's no need to complicate the DL library design discussion with that stuff.