
Must be, because in the field of artificial intelligence, if these techniques are not in production and are considered obsolete, it's for a good reason.

It may have been state-of-the-art in the 1980s, but now it's a bit late.

They were very smart people in their time, though.

In current times, a global prize for the transformers folks would at least make more sense considering the context (despite it not being Physics).



The landmark Deep Belief Networks (stacked RBMs) paper in Science was in 2006 [1]. DBNs became obsolete quite quickly, but don't deny the immense influence of this line of research: the paper has over 23k citations, was cited by the Nobel committee, and was my introduction to deep learning, for one.

You're completely incorrect to say RBMs were of theoretical interest only. They have had plenty of practical use in computer vision/image modelling up to at least a few years ago (I haven't followed them since). Remember the first generative models of human faces?

Edit: Wow, Hinton is still pushing forward the state of the art on RBMs for image modelling, and I am impressed with how much they've improved in the last ~5 years. Nowhere near diffusion models, sure, but "reasonably good" [2].

[1] G. E. Hinton and R. Salakhutdinov, "Reducing the Dimensionality of Data with Neural Networks," Science, 2006.

[2] "Gaussian-Bernoulli RBMs Without Tears," https://arxiv.org/pdf/2210.10318



