
Yea, it's going to be interesting to see how this all unfolds.

In my opinion, the future of software development looks like this:

1) Human developer instructs A.I. bots to build large volumes of code (like Second)

2) Human developer uses A.I. code assistants in the IDE to make "low level" changes or additions

I do not believe that developers or engineers will be replaced. The world needs more software, and there's not enough of us to build it all. Quite frankly, we need the help of bots to meet the demand.



No/low-code not working this time either is not a surprise. All these attempts have been made several times throughout history. Also, as Fred Brooks famously argued, there is no silver bullet: no/low-code promised to be one while only removing accidental complexity (again, Brooks's terminology).

AI changes this, however. The scenario you are describing assumes that AI will get stuck, as I said. Otherwise I don't see why we would still need humans to supervise it. But if it can write good enough software without expensive human beings making corrections left and right, it will be able to do a lot of the jobs that we're trying to make more efficient (or even possible) with all this software we're writing. E.g. you don't really need GitHub (Jira, etc.) if you don't have a lot of developers cooperating. And this is true for a lot of white collar professions that we're creating these heaps of web apps for, that generate all that demand that you are mentioning.

If we get stuck (and, BTW, have some time to figure out how to live with AI) then sure, we'll have programmers (and non-programmers with good analytical skills) command and supervise AI coders. Everyone and their colleague will turn into a tech-lead/product manager. If.


> No/low-code not working this time either is not a surprise. All these attempts have been made several times throughout history. Also, as Fred Brooks famously argued, there is no silver bullet: no/low-code promised to be one while only removing accidental complexity (again, Brooks's terminology).

That's basically my opinion too: yeah, sure, this looks impressive, until you realise that it's basically recreating WordPress with fewer modules.

Generating code is far from enough to replace developers; it's only the earliest step. Even FrontPage itself is 30 years old at this point.


I see that

> The world needs more software, and there's not enough of us to build it all

... is a paraphrasing of the following from your top-level text

> Moreover, the world needs more software than there are engineers to build it all.

... which I'm very skeptical about. In what sense does the world "need" more software?

What exactly is it that "the world" really needs this additional software for, and is it really software that's missing, or can the same need be fulfilled via other means (e.g. better software, or improved processes)?

And to follow-up on that, if the world does need "more software", is there some level of software that is sufficient? I.e. is there a steady state of software/world, or at least, software/person, or are we "maximizing paperclips"?


3) AI bot claims IP rights to code it wrote, but so does the AI company that provided the service, and you end up in an Oracle vs Google vs John Doe scenario.


I absolutely think people won't get hired though, that otherwise would

Instead of looking for layoffs, look for the jobs and payment transactions never happening

Note: this doesn't bother me


> I absolutely think people won't get hired though, that otherwise would

If it is successful, it increases jobs in the field while shifting skill demands to higher levels of analysis/abstraction. Just like every improvement in how you program computers since "physically configure wiring or switches".

Possibly some individuals won't get hired in the field that otherwise would have, but the size of the field developing software isn't likely to shrink because of it.


Software has been automating itself for 75 years. Automation has been cannibalizing itself forever. And yet there are more programmers than ever. AI will just expand the amount of software or create whole new fields on top.


I guess there is more complexity than ever too?

I feel like the early days of programming might've actually been easier to automate. There were often well-designed specs, cleaner design, and architectural patterns to follow, etc. Not in all cases, of course. Now, however, I'm seeing people hoping to god that ChatGPT can write a client for the world's shittiest XML API because no one else sure as hell wants to do it.

So what happens now? We just generate more shit and then use moar AI (tm) to work with moar shit (tm), and evolve that pattern faster and faster?

Yesterday I was using Copilot and it was suggesting a bunch of obvious auto-completes etc., which is fine, that's why I use it. Then it hit me: most people are probably going to stop bothering with libraries, clean APIs, and specs, because soon you will just be brute-forcing your way to a solution using text-to-code, and on we go. Maybe the answer really will be to keep evolving generative coding AIs to keep up with it; let's see how it goes.


People using AI to generate crap faster is definitely a (short term) risk. And, BTW, not only with code. I read an article stating that in a few years (2-3, can't remember) 80% of content will be generated by AI. Which would be a disaster. We're already swimming in low-quality information, and this will only make it worse.

OTOH, so much for the idea that people will guide the hand of AI to create better code. As this unfolds, there will be ever more incentives to remove (most) humans from the loop. And if the past can be used to predict the future, what we have seen so far is that when AI gets reasonably competent at something, it gains superhuman capabilities very quickly. I still keep bringing up how people thought that after AlphaGo beat Fan Hui (the European Go champion), it would take a huge leap for it to beat Lee Sedol, because Lee Sedol was so much better than Fan Hui, at least in human terms. It only took DeepMind a few months of training and tinkering.

Speaking of loops: since AIs are taught from the internet (whether simple language models or coding specialists), we're creating an unwanted feedback loop here. More AI-generated content will likely make teaching future AIs harder, at least with information harvested from the internet after ~2022-ish.


There is another way: not all AI training data must be sourced from humans. You can loop an LLM with a compiler and a set of tests to run, and it will happily search for solutions on its own. Or it could be connected to a simulator, a game, or any real-life process and learn to optimise a goal that can be measured without human work.
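The loop described here can be sketched in a few lines. This is a toy illustration, not a real system: `propose` below is a hypothetical stand-in for an actual model call (it just cycles through canned candidates), and the "compiler plus test suite" is a couple of assertions.

```python
# Toy sketch of a generate-and-test loop. propose() is a hypothetical
# stand-in for an LLM call; a real system would query a model. The loop
# needs no human-labelled data: candidates are scored purely by whether
# they pass the tests.

def propose(prompt, attempt):
    """Placeholder for a model call: cycles through canned candidates."""
    guesses = [
        "def add(a, b): return a - b",   # wrong, fails the tests
        "def add(a, b): return a + b",   # correct
    ]
    return guesses[attempt % len(guesses)]

def passes_tests(source):
    """The 'compiler + test suite' half of the loop."""
    ns = {}
    try:
        exec(source, ns)                 # compile/execute the candidate
        return ns["add"](2, 3) == 5 and ns["add"](-1, 1) == 0
    except Exception:
        return False

def search(prompt, max_attempts=10):
    """Keep sampling candidates until one passes, or give up."""
    for attempt in range(max_attempts):
        candidate = propose(prompt, attempt)
        if passes_tests(candidate):
            return candidate
    return None

solution = search("write add(a, b)")
```

A real version would also feed the failing test output back into the next prompt; the point is only that the reward signal (tests passing) is mechanical, so the loop can run without a human in it.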


Sure, most software developers prefer to automate as much of their work as possible, starting with compilers, build systems, reusable software (libraries, frameworks), etc.

But these are fundamentally different from AI. As I mentioned above, Fred Brooks, in his essay "No Silver Bullet" (reprinted in The Mythical Man-Month), would argue that all these merely decrease the accidental complexity. The essential complexity still remains: taking a vague description, an idea, something embedded in the heads of the stakeholders, and turning it into actual computer code, finding and removing inconsistencies, with a minimal number of bugs. Now, AI may be able to do that one day. And it seems that day is not that far away.

The hard move is going from a vague set of incomplete requirements (which is always the case) to specific, executable code. We have never had a tool that could do this transformation. Until now(ish).



