I'll give this a shot after work, I think.

The question I have is: what's a prompt that reliably triggers hallucinations but still produces the correct answer some of the time?

I know it gets some Python functions "wrong", but I think those were actually "right" in the version it was trained on, so software questions seem out.
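
If it helps anyone else trying this: one rough way to find such a prompt is to sample it repeatedly and tally how often the completions contain a known-correct answer; a prompt is a good candidate if that rate lands strictly between 0 and 1. A minimal Python sketch, where query_model() is a hypothetical stand-in for whatever model API you're actually calling:

    from collections import Counter

    def query_model(prompt: str) -> str:
        """Hypothetical stand-in for a real model call; replace this stub
        with whatever client you use. It just needs to return the model's
        text completion for the given prompt."""
        raise NotImplementedError("wire up your model API here")

    def hallucination_rate(prompt: str, correct_answer: str, n: int = 20) -> float:
        """Sample the same prompt n times and return the fraction of
        completions that do NOT contain the known-correct answer."""
        wrong = 0
        answers = Counter()
        for _ in range(n):
            completion = query_model(prompt)
            answers[completion.strip()] += 1
            # Count any completion missing the known answer as a hallucination.
            if correct_answer.lower() not in completion.lower():
                wrong += 1
        print("distinct answers seen:", answers.most_common(5))
        return wrong / n

    # A prompt is a good test case if this lands strictly between
    # 0.0 and 1.0, i.e. it hallucinates sometimes but not always.

The substring check is crude; for free-form answers you'd probably want a fuzzier match.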


