Hacker News

What modern ML uses those techniques? My understanding is that EBMs are quite rare.


The common method for choosing the next output token in an LLM is sampling from a Boltzmann distribution over the logits. If you have seen the term "temperature" in the context of language models, that is a direct link to statistical mechanics.
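A minimal sketch of what this comment describes, assuming plain temperature-scaled softmax sampling (function name and values here are illustrative, not from any particular LLM implementation). Dividing logits by a temperature T mirrors the Boltzmann factor exp(-E/T): low T concentrates probability on the highest logit, high T flattens toward uniform.

```python
import numpy as np

def sample_token(logits, temperature=1.0, rng=None):
    """Sample an index from a Boltzmann (softmax) distribution over logits.

    p_i = exp(logit_i / T) / sum_j exp(logit_j / T)
    """
    rng = rng or np.random.default_rng()
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()          # subtract max for numerical stability
    p = np.exp(z)
    p /= p.sum()
    return rng.choice(len(p), p=p)

# Hypothetical logits for a 3-token vocabulary
logits = [2.0, 1.0, 0.1]
print(sample_token(logits, temperature=0.7))
```

As temperature approaches 0 this reduces to greedy argmax decoding; as it grows, sampling approaches uniform over the vocabulary.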


I don't find the connection between softmax and the Boltzmann distribution all that deep, tbh (compared to, say, the connection between field theory/Ising models and EBMs).


Research, mostly. Many of these techniques are used for speech synthesis and occasionally for image models (not LLMs).



