Hacker News | teaearlgraycold's comments

I’ve been having a lot of fun with the Pi Pico 2W. It can host an access point, a web server, be a USB host, and of course has GPIO. And not running an OS means it’s way simpler.
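For anyone curious what "no OS, just GPIO and a web server" looks like, here's a rough MicroPython sketch of a Pico W/2W hosting its own access point and answering HTTP requests. The SSID, password, and response body are placeholders, and the `network`/`socket` modules here are MicroPython's, so this runs on the board rather than desktop Python; check the MicroPython docs for your firmware version.

```python
# MicroPython sketch: Pico W/2W as a standalone access point + tiny web server.
# Runs on-device only; `network.WLAN(network.AP_IF)` is MicroPython-specific.
import network
import socket

# Bring up the access point (SSID/password are placeholders).
ap = network.WLAN(network.AP_IF)
ap.config(essid="pico-demo", password="change-me-1234")
ap.active(True)

# Minimal HTTP responder on port 80.
s = socket.socket()
s.bind(("0.0.0.0", 80))
s.listen(1)
while True:
    conn, addr = s.accept()
    conn.recv(1024)  # read and ignore the request details
    conn.send(b"HTTP/1.0 200 OK\r\nContent-Type: text/plain\r\n\r\nhello from the pico\r\n")
    conn.close()
```

Connect a phone or laptop to the `pico-demo` network and browse to the board's IP to see the response.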

It’s an incredibly lopsided machine. The Pi 5 is decently powerful, but you really, really should not attempt to use one as a desktop replacement. While it's theoretically possible, you're much better off with a $50 used SFF PC.

I don't think Claude Code offers disabling thinking entirely; "low" thinking is the minimum I'm seeing.

Even with Opus I don’t usually hit limits on the standard plan. But I am not doing professional work at the moment and I actually alternate between using the LLM and reading/writing code the old fashioned way. I can see how you’d blow through the quota quickly if you try to use LLMs as universal problem solvers.

Anthropic is not building goodwill as a consumer brand. They've got the best product right now, but there's a spring coiled behind me, ready to launch me into OpenCode as soon as the time is right.

Would you use Opus if you switched to OpenCode?

I'd like to use Opus with OpenCode right now to combine the best TUI agent app with the best LLM. But my understanding is Anthropic will nuke me from orbit if I try that.

You can use Opus with OpenCode anytime you want, just not with the Claude plan. You can use it via API with any provider, including Anthropic's API. You can use it with Github Copilot's plan. The only thing you can't do without getting banned is use OpenCode with one of Claude's plans.
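For what the API route looks like in practice: OpenCode reads a JSON config where you pick a provider-scoped model. A rough sketch only; the key name and the exact model identifier are assumptions to verify against OpenCode's documentation:

```json
{
  "model": "anthropic/claude-opus-4-5"
}
```

You'd also set your Anthropic API key in the environment, and billing is then per token rather than through a plan.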

I keep seeing this "you can use the inconvenient and unpredictably costly way all you want" pedantic knee-jerk response a lot lately.

It's like saying humans can fly with a paraglider: correct and useless. Most people here won't have the cash to burn on unbounded Opus API usage.


If you want to use Opus with a different coding harness along with a coding plan, you can use GitHub Copilot. It even has built-in authentication with OpenCode.

OpenCode with a Copilot Business sub and Opus 4.6 as the model works well

I'm looking at their plans (https://github.com/features/copilot/plans) and it seems like the limits might be pretty low, even with the Pro+ plan, which is 2x the cost of Claude Pro. It seems like Claude Pro might be 10-20x the Opus tokens for only twice the price.

Copilot has a totally different billing model. It's request-based rather than token-based. Counter-intuitively, in our case at least, it is way cheaper than token-based pricing. One request can sometimes consume 2-4 million tokens but is billed as a single request (or its multiplier, if using a premium model like Opus).
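To make the "way cheaper" point concrete, here's back-of-envelope arithmetic comparing the two billing models. Every number below (the per-token rate, the per-request price, and the premium-model multiplier) is an illustrative assumption, not a real price from either provider:

```python
# Back-of-envelope comparison: token-based vs request-based billing.
# All prices and multipliers are illustrative assumptions, not quotes.
TOKEN_PRICE_PER_M = 5.00   # $ per 1M tokens (assumed blended rate)
REQUEST_PRICE = 0.04       # $ per premium request (assumed)
OPUS_MULTIPLIER = 10       # assumed multiplier for a premium model

def token_cost(tokens: int) -> float:
    """Cost of a job if billed per token."""
    return tokens / 1_000_000 * TOKEN_PRICE_PER_M

def request_cost(requests: int, multiplier: int = OPUS_MULTIPLIER) -> float:
    """Cost of the same job if billed per request, scaled by the multiplier."""
    return requests * REQUEST_PRICE * multiplier

# One agentic request that internally chews through 3M tokens:
print(token_cost(3_000_000))  # 15.0 under token billing
print(request_cost(1))        # ~0.40 under request billing
```

Under these assumed numbers the request-billed job is dozens of times cheaper, which matches the parent's experience; the gap obviously shrinks or flips for jobs that make many small requests.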

I wouldn’t be caught dead with less than 200MB of cache in my desktop in 2026.

> The fix is obvious: work on something else while Claude runs.

Disagree. The fix is actually counter-intuitive: give Claude smaller tasks so that it completes them in less time and you remain in the driver's seat.


Anyone who ships a k8s cluster to make a JS library available over RPC needs to have a long hard look in the mirror. Should have bundled Node, QuickJS, anything into the Go nodes for the first pass. k8s truly is a cancer for many teams.

Also these people are using the memory features. In technical circles I’ve seen people made fun of for having it enabled. It’s considered “cringe”.

Anyone who lets the word "cringe" affect their thoughts or behavior needs to learn to think for themselves.

I think this is an incremental case of Poe's Law. I use the quotation marks to indicate a degree of tongue-in-cheek humor. But yes, there's social pressure against using LLM providers' memory features.

Why and how so? Are people watching your LLM sessions?

Not far off from SF rates TBH.
