Hacker News

I have a limited understanding of the topic, but would this allow running an LLM on a mobile phone in offline mode? If that's feasible, it would pave the way to lots of interesting applications, such as AI-assisted content moderation without having to phone home with confidential data.


Yes, this may (significantly) improve that. Even without it, you can already run LLMs on mobile phones; the questions are just how big a model fits, how strongly it has to be quantized, and whether the few models that remain produce good enough results.

E.g. there was a GH Discussion about running LLMs on Apple A-series chips (iPhone) posted here yesterday: https://news.ycombinator.com/item?id=38703161
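To make the size-vs-quantization trade-off concrete, here is a back-of-the-envelope sketch. The numbers are illustrative assumptions (a 7B-parameter model, weights only, ignoring activation memory and quantization overhead), not figures from the thread:

```python
# Rough weight-storage footprint of a 7B-parameter model at different
# quantization levels. Weights only; activations and per-tensor
# quantization overhead are ignored for simplicity.

def model_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Approximate weight storage in gigabytes (1 GB = 1e9 bytes)."""
    return n_params * bits_per_weight / 8 / 1e9

n = 7e9  # a hypothetical 7B-class model
for label, bits in [("fp16", 16), ("int8", 8), ("4-bit", 4)]:
    print(f"{label:>5}: ~{model_size_gb(n, bits):.1f} GB")
# fp16: ~14.0 GB, int8: ~7.0 GB, 4-bit: ~3.5 GB
```

So a 4-bit 7B model (~3.5 GB of weights) can plausibly fit in the 6-8 GB of DRAM a recent flagship phone ships with, while the fp16 version cannot, which is why quantization level largely decides what runs on-device.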


Yes, the ultimate goal is to run larger models on the phone, since phones have very limited DRAM.


I'm not sure, but I think that's a selling point of the new Pixel.



