The common method for choosing the next output token of an LLM is sampling from a Boltzmann distribution over the logits. If you have seen the term "temperature" in the context of language models, that is a direct link to the statistical mechanics of gases.
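A minimal sketch of what this looks like in practice (the function name `sample_token` and the example logits are my own, not from any particular library): dividing the logits by a temperature T before the softmax is exactly the Boltzmann weighting exp(E/T), so low T concentrates probability on the top logit and high T flattens the distribution.

```python
import numpy as np

def sample_token(logits, temperature=1.0, rng=None):
    """Sample an index from the Boltzmann (softmax) distribution over logits."""
    rng = rng or np.random.default_rng()
    scaled = np.asarray(logits, dtype=float) / temperature
    scaled -= scaled.max()          # subtract max for numerical stability
    probs = np.exp(scaled)
    probs /= probs.sum()            # normalize: this is softmax(logits / T)
    return rng.choice(len(probs), p=probs)

# As T -> 0 this approaches greedy argmax decoding; as T -> infinity
# it approaches uniform sampling over the vocabulary.
logits = [2.0, 1.0, 0.1]
```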


I don't find the connection between softmax and Boltzmann all that deep, tbh (compared to, say, the connection between field theory/Ising models and EBMs).



