Are the gradient visualizations not doing it for you?
Of course it breaks down somewhat once the gradient can no longer be visualized as an arrow in 2D or 3D space, and not all concepts transfer to higher dimensions as easily as one would hope, but some do.
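For what it's worth, losing the arrow picture doesn't change the computation: a finite-difference gradient estimate works the same in 4 or 400 dimensions. A throwaway sketch in plain Python (my own toy example, not anything from the tutorial):

```python
# Central-difference gradient estimate in any number of dimensions.
# Past 3D we can't draw the result as an arrow, but the math is unchanged.

def numerical_gradient(f, x, h=1e-6):
    """Estimate the gradient of f at point x (a list of floats)."""
    grad = []
    for i in range(len(x)):
        x_plus, x_minus = list(x), list(x)
        x_plus[i] += h
        x_minus[i] -= h
        # Central difference along dimension i
        grad.append((f(x_plus) - f(x_minus)) / (2 * h))
    return grad

# f(x) = sum of squares, so the true gradient is 2*x componentwise,
# whether x has 2 components or 200.
f = lambda x: sum(v * v for v in x)
print(numerical_gradient(f, [1.0, 2.0, 3.0, 4.0]))  # ≈ [2.0, 4.0, 6.0, 8.0]
```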
It is quite different, because it's one thing to look at a math expression like an SDF and understand the 3D shape that comes out of it, the math behind a demoscene plasma field, or a ray-traced shape.
It's another thing entirely to make heads or tails of what a neural network with backpropagation means.
The PyTorch3D section was genuinely useful for me. I've been doing 2D ML work for a while but hadn't explored 3D deep learning — didn't even know PyTorch3D existed until this tutorial.
What worked well was the progressive complexity. Starting with basic mesh rendering before jumping into differentiable rendering made the concepts click. The voxel-to-mesh conversion examples were particularly clear.
If anything, I'd love to see a follow-up covering point cloud handling, since that seems to be a major use case based on the docs I'm now digging through.
Thanks for writing this — triggered a weekend deep-dive I probably wouldn't have started otherwise.
This does an honestly good job of walking through the beginnings. I would still say it's worth understanding/decomposing a decision tree and going through the details, choices, and trade-offs one makes in how the tree is prepared, like binary splits or discretizing/binning continuous data, what reducing entropy means, etc. Maybe even start with the pros/cons of parametric versus nonparametric modeling. You really get to see how probability and statistics are applied in the formulas that eventually get thrown into a dot function in Python.
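To make the entropy point concrete, here's a rough plain-Python sketch (my own toy functions, not from the article) of what "reducing entropy" means for a binary split:

```python
import math

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    probs = [labels.count(c) / n for c in set(labels)]
    return -sum(p * math.log2(p) for p in probs)

def information_gain(labels, left, right):
    """Entropy reduction from splitting `labels` into `left` + `right`.

    This weighted-average-of-children form is the quantity a decision
    tree maximizes when choosing a binary split point.
    """
    n = len(labels)
    weighted = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(labels) - weighted

# A perfectly separating split removes all uncertainty:
labels = [0, 0, 1, 1]
print(information_gain(labels, [0, 0], [1, 1]))  # 1.0 bit
```

Binning a continuous feature just turns "which threshold?" into evaluating this gain at each candidate split.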
There is a lot of content on PyTorch, which is great and makes a ton of sense since it's used so heavily, but where the industry really needs help/support is the fundamentals. Nonetheless, great contribution!
This was quite accessible. If I had to pick one point, I wish there were more "handholding" from gradient to gradient descent, i.e. in the style of the math-focused introduction of the function with one parameter, two parameters, etc. that was done earlier. It felt like a bit of a sudden jump from the math to the code. I think the gentle introduction to the math is very valuable here.
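To illustrate the jump I mean, something like this one-parameter sketch (my own toy example, not the article's code) is the bridge I was missing between "here is a gradient" and "here is the training loop":

```python
# Single-parameter gradient descent on f(w) = (w - 3)^2.
# The derivative f'(w) = 2*(w - 3) points uphill; we step against it.

def gradient(w):
    return 2 * (w - 3)      # derivative of (w - 3)^2

w = 0.0                     # starting guess
lr = 0.1                    # learning rate (step size)
for step in range(100):
    w -= lr * gradient(w)   # move opposite the gradient

print(round(w, 4))          # converges to 3.0, the minimum
```

The two-parameter version is the same loop with a list of partial derivatives, which is exactly where the article's math section stops and the code begins.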
Are there other similar tutorials going into the fundamentals of model architectures, for example? Something like https://poloclub.github.io/cnn-explainer/
Interesting article. It would be really useful if you added a full article title to the page metadata, so it would get bookmarked with a title. I assume one does not need a GPU to try out the simple examples provided?
Very nice, thanks! It’s great to be able to play with viz!
For a deeper tutorial, I highly recommend the PyTorch for Deep Learning Professional Certificate on deeplearning.ai — probably one of the best MOOCs I've seen so far.
Thank you so much. Really appreciate the thoughtful feedback!
I've watched many intros. Somehow they always end with 90%+ accuracy, and that was just not my experience while learning on datasets I picked myself. I remember spending hours tuning different parameters and not quite understanding why I was getting much worse accuracy. I showed this intentionally, and I'm glad you commented on it!
https://0byte.io/articles/neuron.html
https://0byte.io/articles/helloml.html
He also publishes to YouTube, where his clear explanations and high production values deserve more views.
https://www.youtube.com/watch?v=dES5Cen0q-Y (part 2 https://www.youtube.com/watch?v=-HhE-8JChHA) is the video to accompany https://0byte.io/articles/helloml.html