Definitely room for multiple approaches, including local LLMs.

But I just don't think that, for most users, local LLM capabilities will be a deciding factor in hardware or OS choices.

A cloud subscription will be the premium offering ($20/month for consumers, $100 to $1,000/month or pay-per-token for businesses), and inevitably there will be an ad-supported tier at a lower price, or free, for low-end consumers.

Once Joe Consumer has access to that ChatGPT subscription or free tier, are they really going to run a far less powerful model on their laptop? Outside of a few simple tasks (semantic search across their email, notes, and photos, or on-device transcription; see the sketch below), local models will just be too far behind the curve for the public to make much use of them.
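To be fair, that on-device semantic search niche is already practical today. Here's a minimal sketch, assuming the sentence-transformers library and the all-MiniLM-L6-v2 embedding model (both my picks for illustration, not anything in particular):

    from sentence_transformers import SentenceTransformer, util

    # Small embedding model that runs comfortably on a laptop CPU.
    model = SentenceTransformer("all-MiniLM-L6-v2")

    notes = [
        "Dentist appointment moved to Thursday at 3pm",
        "Flight confirmation: SFO to JFK, seat 14C",
        "Grocery list: eggs, oat milk, coffee beans",
    ]

    # Embed the corpus once; a real app would cache these vectors on disk.
    corpus_embeddings = model.encode(notes, convert_to_tensor=True)

    query = "when is my dentist visit?"
    query_embedding = model.encode(query, convert_to_tensor=True)

    # Cosine similarity ranks the notes by semantic relevance to the query.
    hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=1)
    print(notes[hits[0][0]["corpus_id"]])  # -> the dentist note

None of that needs a frontier model, which is kind of my point: the durable local niche is small, cheap tasks like this, not general-purpose chat.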




