Hacker News

It doesn’t matter how fast you run if it’s not the correct direction.



Good LLM wielders run in widening circles and get to the goal faster than good old-school programmers running in a straight line.

I try to avoid LLMs as much as I can in my role as SWE. I'm not ideologically opposed to switching, I just don't have any pressing need.

There are people I work with who are deep in the AI ecosystem, and it's obvious what tools they're using. It would not be uncharitable in any way to characterize their work as pure slop: it doesn't work, it's buggy, it's not adequately tested, etc.

The moment I start to feel behind I'll gladly start adopting agentic AI tools, but as things stand now, I'm not seeing any pressing need.

Comments like these make me feel like I'm being gaslit.


We are all constantly being gaslit. People have insane amounts of money and prestige riding on this thing paying off in such a comically huge way that it absolutely cannot deliver in the foreseeable future. Creating a constant, pressing sentiment that actually You Are Being Left Behind Get On Now Now Now is the only way they can keep inflating the balloon.

If this stuff was self-evidently as useful as it's being made out to be, there would be no point in constantly trying to pressure, coax and cajole people into it. You don't need to spook people into using things that are useful, they'll do it when it makes sense.

The actual use-case of LLMs is dwarfed by the massive investment bubble it has become, and it's all riding on future gains that are so hugely inflated they will leave a crater that makes the dotcom bubble look like a pothole.


Then where is all this new and amazing software? If LLMs can 10x or 100x someone's output, we should have seen an explosion of great software by now.

One dude with an LLM should be able to write a browser fully capable of browsing the modern web or an OS from scratch in a year, right?


That's a silly bar to ask for.

Chrome took at least a thousand man-years, i.e. 100 people working for 10 years.

I'm lowballing here: it's likely way, way more.

If AI gives a 10x speedup, reproducing Chrome as it is today would require 1 person working for 100 years, 10 people working for 10 years, or 100 people working for 1 year.

Clearly, that's an unrealistic bar to meet.
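The arithmetic above can be sketched out directly, taking the comment's own assumptions (Chrome at roughly 1000 man-years, a 10x AI speedup) at face value:

```python
# Back-of-the-envelope man-year arithmetic from the comment above.
# Assumptions (from the comment, stated there as a lowball):
#   - Chrome cost roughly 1000 man-years to build
#   - AI gives a 10x speedup
chrome_man_years = 1000
ai_speedup = 10

# Effort remaining after the speedup, in man-years.
remaining = chrome_man_years / ai_speedup  # 100 man-years

# The same 100 man-years split across different team sizes.
for people in (1, 10, 100):
    years = remaining / people
    print(f"{people} person(s) for {years:.0f} year(s)")
```

Even granting the 10x multiplier, the single-developer case still comes out to a century of work, which is the point being made.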

If you want a concrete example: https://github.com/antirez/flux2.c

The creator of Redis started this project 3 weeks ago and used Claude Code to vibe-code it.

It works, it's fast, and the code quality is as high as I've ever seen in a C code base. Easily in the top 1% for quality.

Look at this one-shotted working implementation of a JPEG decoder: https://github.com/antirez/flux2.c/commit/a14b0ff5c3b74c7660...

Now, it takes a skilled person to guide Claude Code to generate this, but I have zero doubt that this was done at least 5x-10x faster than Antirez writing the same code by hand.


Ah, right, so it's a "skill issue" when GPT5.3 has no idea what is going on in a private use case.

Literally yes

I still haven’t seen those mythical LLM wielders in the wild. Meanwhile, the tools I use (curl, jq, cmus, calibre, OpenBSD, …) were most certainly created by those old-school programmers.



