Everyone keeps arguing that AI is not Apple’s core business and that its priorities are different. From an end-user perspective, that is irrelevant.

What users actually experience is this: every other major platform is shipping increasingly capable intelligent assistants. These systems can interpret intent, execute multi-step actions, and meaningfully reduce friction. Meanwhile, Siri still struggles with fairly basic workflows.

At the end of the day, I do not particularly care about internal constraints, organizational structure, privacy positioning, or strategic rationale. What matters is whether the product works.

Today, I still cannot reliably:

- Dictate complex voice input without constant correction

- Use voice to control my iPhone in composable ways, such as “open this contact and send a message,” “replay the song I liked yesterday,” or “create a note in Obsidian with this content: …”

- Chain actions together in a way that reflects actual user intent

These are not futuristic requests. They are practical, everyday workflows that competitors are increasingly able to handle.
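
To be concrete about the last two points: Apple already has the plumbing for this in its App Intents framework, where apps expose typed actions that Siri and Shortcuts can invoke. A rough sketch of what a note-creation action looks like on the app side (names here are illustrative, not Obsidian’s actual Shortcuts actions):

    import AppIntents

    // Hypothetical intent a note-taking app could expose so an assistant
    // can compose it with other actions; not Obsidian's real integration.
    struct CreateNoteIntent: AppIntent {
        static var title: LocalizedStringResource = "Create Note"

        @Parameter(title: "Title")
        var noteTitle: String

        @Parameter(title: "Content")
        var content: String

        func perform() async throws -> some IntentResult {
            // A real app would write the note into its store here.
            return .result()
        }
    }

The missing piece isn’t this action surface; it’s an assistant that can take “create a note in Obsidian with this content” or “open this contact and send a message” and plan across a handful of intents like this one. That planning layer is where Siri still falls down.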

The gap is no longer about incremental feature parity. It is about whether Apple can deliver a genuinely intelligent interface layer, or whether Siri remains a deterministic command parser in an era where users expect contextual reasoning.
