
Does OpenAI actually specify the size of the model?

InstructGPT 1.3B outperformed GPT-3 175B, and ChatGPT has a huge corpus of distilled prompt -> response data now.
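A rough sketch of what such a distilled corpus could look like: collect the larger model's prompt -> response pairs and reshape them into fine-tuning records for a smaller model. The JSONL schema below (`prompt`/`completion` keys) is an assumption for illustration; the actual format any lab uses internally isn't public.

```python
import json

def to_finetune_jsonl(pairs):
    """Convert distilled (prompt, response) pairs into JSONL fine-tuning
    records. Hypothetical schema -- shown only to illustrate the idea of
    training a smaller model on a bigger model's outputs."""
    lines = []
    for prompt, response in pairs:
        record = {"prompt": prompt, "completion": response}
        lines.append(json.dumps(record))
    return "\n".join(lines)

# Example: two distilled pairs harvested from a larger model's outputs
pairs = [
    ("What is 2+2?", "4"),
    ("Name a prime number.", "7"),
]
print(to_finetune_jsonl(pairs))
```

Each JSONL line then becomes one supervised training example for the smaller model.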

I’m assuming most of these requests are being served from a much smaller model to justify the price.

OpenAI is fundamentally about training larger models; I doubt they want to be in the business of selling A100 capacity at cost when it could be used for training.


