Is it an evaluable fact that any particular LLM "is substantially less often correct" than what?

The comparison to biology is to ask whether what are termed "hallucinations" are different from what human minds do.
