Great idea. I'll run some experiments to see how it performs. It sounds analogous to k-means++ initialization. Sobol sequences ring a bell: some of the Bayesian optimization software libraries may in fact use a Sobol sequence for the initial evaluations, though it may not be well documented.
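For anyone who wants to try this, a minimal sketch of drawing Sobol points for initial hyperparameter evaluations, using SciPy's quasi-Monte Carlo module (assumes SciPy >= 1.7; the parameter ranges below are made up for illustration):

```python
from scipy.stats import qmc

# Sobol sampler for a hypothetical 2-D hyperparameter space.
sampler = qmc.Sobol(d=2, scramble=False)

# 2**3 = 8 low-discrepancy points in the unit square [0, 1)^2.
points = sampler.random_base2(m=3)

# Scale to hypothetical hyperparameter ranges, e.g. [0, 1] and [1, 10].
lower, upper = [0.0, 1.0], [1.0, 10.0]
scaled = qmc.scale(points, lower, upper)
print(scaled.shape)  # (8, 2)
```

Each row of `scaled` would then be one initial configuration to evaluate before the Bayesian optimization loop takes over.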
Don't be sad. I'm happy to update the blog post as needed. By overfitting, do you mean over-optimizing the results on the validation set? Based on what I understand about nested CV, it is only necessary if (1) the hold-out validation set is too small to be representative of the overall data distribution, or (2) the model training procedure itself is unstable and produces models with wildly varying results on the same dataset.
To prevent overfitting to the training data, one performs hold-out validation or cv or early stopping in the training process.
To prevent overfitting of hyperparameters to a small validation dataset, or to mitigate the variance of the model training outcome, one can use nested cv.
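To make the two-level structure concrete, here is a minimal, self-contained sketch of nested CV with a toy shrinkage model (the model, the `alpha` hyperparameter, and the data are all made up for illustration; in practice you would use a real learner, e.g. scikit-learn's GridSearchCV inside cross_val_score):

```python
def k_folds(n, k):
    """Split indices 0..n-1 into k (train, test) index pairs."""
    idx = list(range(n))
    folds = [idx[i::k] for i in range(k)]
    return [([i for i in idx if i not in set(f)], f) for f in folds]

def fit(y_train, alpha):
    """Toy model: predict the training mean, shrunk toward 0 by alpha."""
    return (sum(y_train) / len(y_train)) / (1.0 + alpha)

def mse(pred, y):
    return sum((pred - v) ** 2 for v in y) / len(y)

def nested_cv(y, alphas, outer_k=5, inner_k=4):
    outer_scores = []
    for tr, te in k_folds(len(y), outer_k):
        y_tr = [y[i] for i in tr]

        # Inner CV: select alpha using ONLY the outer-training data.
        def inner_score(a):
            s = 0.0
            for itr, ite in k_folds(len(y_tr), inner_k):
                pred = fit([y_tr[i] for i in itr], a)
                s += mse(pred, [y_tr[i] for i in ite])
            return s / inner_k

        best_alpha = min(alphas, key=inner_score)

        # Refit on all outer-training data, score on the untouched outer fold.
        pred = fit(y_tr, best_alpha)
        outer_scores.append(mse(pred, [y[i] for i in te]))

    # Unbiased estimate of the WHOLE procedure (tuning + fitting).
    return sum(outer_scores) / outer_k

y = [float(i % 7) for i in range(40)]  # toy 1-D targets
print(nested_cv(y, alphas=[0.0, 0.1, 1.0]))
```

The key point is that the outer test folds never influence hyperparameter selection, so the outer score estimates the generalization error of the full model-building procedure, not of one tuned model.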
It is a common misconception and a huge source of disappointment with ML -- without proper validation of the whole model-building procedure (method selection + parameter tuning + feature selection + fitting), no amount of data and magic tricks will assure you that there is no overfitting. Even a single hold-out test is risky because it gives you no idea about the expected variance of the accuracy.
Linear algebra is one of my favorite subjects. Its design is beautifully simple, yet extremely powerful. Half of modern machine learning (and all of Matlab) is built on matrix algebra. And the existence of fast software for numeric linear algebra makes it practically applicable.
The link to graph theory is beautiful, too. Entries in a matrix can represent edge weights, taking a random walk on a graph can be represented as a matrix-vector multiplication, and the stationary distribution is the leading eigenvector of the transition matrix. How cool is that? When you start to link together abstractions from different fields of mathematics and science, you get these fantastic insights that are just mind-bogglingly awesome. This is what makes all the pain of wading through an ocean of symbols and equations worthwhile, imho.
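The random-walk idea fits in a few lines of NumPy. A sketch on a made-up three-node weighted graph: one walk step is a vector-matrix product, and iterating it converges to the stationary distribution (which, for an undirected graph, is proportional to the weighted node degrees):

```python
import numpy as np

# Hypothetical undirected graph given by a symmetric weight matrix.
A = np.array([[0., 1., 1.],
              [1., 0., 2.],
              [1., 2., 0.]])

# Row-stochastic transition matrix: P[i, j] = prob. of stepping i -> j.
P = A / A.sum(axis=1, keepdims=True)

pi = np.full(3, 1 / 3)  # start from the uniform distribution
for _ in range(200):
    pi = pi @ P          # one random-walk step as matrix multiplication

# Stationary distribution is proportional to the weighted degrees.
degrees = A.sum(axis=1)
print(pi)  # ~ degrees / degrees.sum() = [0.25, 0.375, 0.375]
```

This is exactly power iteration, the same mechanism behind PageRank.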
Cool! Thanks for the link. I've not read that book, but it looks very interesting and related. It's great to see others' perspectives on math as a human construction.
Another book on the history of mathematics is "Journey Through Mathematics" by Enrique Gonzalez-Velasco. From its back cover:
"This book offers an accessible and in-depth look at some of the most important episodes of two thousand years of mathematical history. Beginning with trigonometry and moving on through logarithms, complex numbers, infinite series, and calculus, this book profiles some of the lesser known but crucial contributors to modern day mathematics."