Multi-billion industry, sure, but that doesn't justify near-trillions in spending and hype.
AI is slightly behind search engines/the internet in terms of usefulness, and even that was a bubble. The sad thing is we've already had a few AI winters before, simply due to the cost of research. If these companies end up losing money, or worse, we end up in a recession because of AI hype, research funding is going to dry up ridiculously fast, which would hurt not only AI but most of the tech industry in the short term.
Well, at least the US has recovered enough to absorb that fall, I hope.
Well, uh, I don't know if this is bad news or good news, but GPT-5 might never be released; their "Orion" model seems to be barely better than GPT-4, if not worse.
OpenAI is already running at a huge loss, and it isn't like o1 is cheap to run either.
I wonder at what price these LLMs can be run in order to be profitable, and whether just running your own model would be worth it.
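For the profitability question, a back-of-envelope sketch helps: the floor on what a provider can charge is roughly its GPU serving cost divided by throughput. All the numbers below (GPU hourly rate, replica size, tokens/second) are assumptions I made up for illustration, not anyone's real figures.

```python
# Rough break-even price per million output tokens, given assumed
# serving costs. Every constant here is a hypothetical placeholder.

GPU_COST_PER_HOUR = 2.50   # assumed cloud rate for one H100-class GPU, in $
GPUS_PER_REPLICA = 8       # assumed GPUs needed to serve one model replica
TOKENS_PER_SECOND = 400    # assumed aggregate throughput of that replica

def breakeven_price_per_million_tokens() -> float:
    cost_per_hour = GPU_COST_PER_HOUR * GPUS_PER_REPLICA   # $/hour for the replica
    tokens_per_hour = TOKENS_PER_SECOND * 3600             # tokens served per hour
    return cost_per_hour / tokens_per_hour * 1_000_000     # $ per 1M tokens

print(f"${breakeven_price_per_million_tokens():.2f} per 1M tokens")
```

With these made-up numbers that comes out to roughly $14 per million tokens before you've paid a single researcher, which is why the "can they ever charge enough" question matters.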
Maybe it could be even cheaper if you're willing to fall behind on R&D, but keep in mind that every time OpenAI invented something, it was quickly copied by its competitors.
I think it will get cheaper when Nvidia has a lot of competition. Currently, Nvidia charges a huge markup, and there is a lot of room to optimize that expense. I believe OpenAI already has plans to move to other hardware, at least partially.
But why would you spend billions training an LLM when it barely shows any improvement over the previous model?
Unless OpenAI is willing to do something desperate, the best they have right now is what LLMs are, so the cost would be in maintaining them. If you've already paid for a bunch of H100s to train on, there is little incentive to move away unless you know TPUs are going to be significantly cheaper to run, cheap enough to justify the new cost of buying them.
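That "cheap enough to justify buying them" condition is just a payback calculation: the switch only makes sense if the running-cost savings recoup the purchase price within the hardware's useful life. The figures below are invented purely to illustrate the shape of the decision.

```python
# Payback period for switching to new hardware. Hypothetical numbers only.

def payback_months(new_hw_cost: float, old_monthly_cost: float,
                   new_monthly_cost: float) -> float:
    """Months until running-cost savings cover the purchase price."""
    savings = old_monthly_cost - new_monthly_cost
    if savings <= 0:
        return float("inf")   # new hardware never pays for itself
    return new_hw_cost / savings

# e.g. $10M of new chips that cut a $2M/month serving bill to $1.5M/month:
print(f"{payback_months(10e6, 2e6, 1.5e6):.0f} months to break even")
```

If the payback period is longer than the hardware stays competitive, the sunk H100s win by default.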
This is ignoring the giant bubble that has ballooned out of AI hype, which, if popped, would be disastrous for the companies most invested in the industry. Nvidia has a P/E ratio of 60-70; if they don't get enough future growth to justify it, they could lose a third of their price, if not more.
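The "third of their price" claim follows from simple multiple arithmetic: if earnings stay flat and the P/E reverts toward a more ordinary level, the price falls by the ratio of the multiples. The target multiple below is a hypothetical, not a forecast.

```python
# Price drop implied by a P/E re-rating with unchanged earnings.
# Both multiples here are illustrative assumptions.

def price_drop(current_pe: float, target_pe: float) -> float:
    """Fractional price decline if earnings are flat and the P/E re-rates."""
    return 1 - target_pe / current_pe

# A re-rate from ~65x to a still-rich ~40x already costs over a third:
print(f"{price_drop(65, 40):.0%}")
```

So even a reversion to a multiple that is generous by historical standards would wipe out more than a third of the price, which is the point being made.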
A lot of the top researchers are working on making LLMs more capable, so it's not impossible for new breakthroughs to occur; they just might not be as rapid-paced as the last two years or so.
There's also lots of utility to be found with the best LLMs today. I'm working on something myself, and have seen others pushing the boundaries in hackathons and startups. So that's a lot of innovation and value that's definitely not a bubble.
The thing with AI isn't that it doesn't have a whole lot of use cases; it's the fact that its use cases don't justify the hype or investment it has gotten.
It's somewhere in between crypto and dot-com as far as bubbles go, and as open models catch up with closed-source ones, it seems less and less likely that these companies are ever going to make back the money they spent, at least short term.
>The thing with AI isn't that it doesn't have a whole lot of use cases; it's the fact that its use cases don't justify the hype or investment it has gotten.
The truly disheartening aspect is the fact that so many technical "leaders" have bought into the hype and are throwing huge sums of money down this rabbit hole. It's not like we haven't been through this sort of thing before.
And many seem convinced that these talking databases that can't do simple math are somehow a threat to society. This is an insult to humanity in more ways than one.
From what I'm aware, they're already bleeding money, so either they have to jack up prices enough to make up for $5bn in losses, or Sam Altman manages to convince investors to contribute another funding round.
Considering he was willing, and delusional enough, to ask for $7T for AI chips, I'm sure he'd try.
My guess, as other commenters have mentioned: they will start building integrations with CRMs, ready-to-deploy RAG apps for enterprise knowledge bases, etc. There is still money to be made; how much, I do not know.
I've been using the GPT-3 API for some work stuff since 2021, and I remember that prices dropped dramatically when GPT-3.5 Turbo came out. Now they are engaged in what I presume are price wars with Google and Anthropic. Anthropic already charges a higher price, even for its Haiku model, for what they called "increased intelligence", but it doesn't beat 4o-mini in benchmarks.
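To make the price gap concrete, here's what a workload costs at two sets of per-token list prices. The rates are illustrative figures roughly in line with late-2024 list pricing, and the workload size is entirely made up; check the providers' pricing pages before relying on either.

```python
# Monthly API cost at given per-million-token list prices.
# Prices and workload below are illustrative assumptions.

def monthly_cost(input_tokens: int, output_tokens: int,
                 in_price_per_m: float, out_price_per_m: float) -> float:
    """Total $ for a month's traffic at the given $/1M-token rates."""
    return (input_tokens * in_price_per_m +
            output_tokens * out_price_per_m) / 1_000_000

# Assumed workload: 500M input tokens, 100M output tokens per month
workload = (500_000_000, 100_000_000)

mini_cost = monthly_cost(*workload, 0.15, 0.60)    # 4o-mini-style pricing
haiku_cost = monthly_cost(*workload, 0.80, 4.00)   # Haiku-style pricing

print(f"mini-style:  ${mini_cost:,.0f}/month")
print(f"haiku-style: ${haiku_cost:,.0f}/month")
```

At these assumed rates the same traffic costs several times more on the pricier model, which is exactly why "increased intelligence" needs to show up in benchmarks to justify itself.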
While there is money to be made in further integrations (hell, that's where I see most of the productivity increase from these tools coming from), OpenAI has already spent billions developing these tools, money which they have to pay back somehow, and soon.
This is also ignoring the giant elephant that is open models. Soon enough, models like Llama will be able to match or even surpass ChatGPT, at which point why would any sufficiently large company pay for the API when they can run their own model, especially when all those GPUs used for training flood the market?
But then again, plenty of large companies still use AWS even when it makes no sense to go serverless, so they might have a market to capitalise on.
We live in interesting times for tech: Moore's law is dead, Intel is falling, layoffs are everywhere...
I sure picked the best time to go to university for Computer Science T-T
Those GPUs also cost a bomb to run. LLMOps isn't super easy; I'm working with a large OEM manufacturer right now as a consultant, and they are also experimenting internally with LLMs, but they have enough resources to run those models. I don't see smaller companies having enough resources to experiment with various models at scale like they are.
>I sure picked the best time to go to university for Computer Science T-T
Going to be a bit contrarian here – while it's true that jobs can be tough to find and layoffs are discouraging, I genuinely believe it's also one of the most exciting times to be in the CS field. The fact that we can actually talk with an "algorithm" is still quite bonkers to me, because I remember fiddling with RNNs and LSTMs just to predict the next word or two in a sentence. There are still ways that we can leverage it to make something really cool. Perplexity is one. Phind is another. Notion's AI integrations are great.
I graduated about four years ago and faced my own setbacks, including getting laid off from my first job. But despite those hurdles, I've managed to find my footing and am doing reasonably well now.
Just hang in there champ.
>Those GPUs also cost a bomb to run. LLMOps isn't super easy; I'm working with a large OEM manufacturer right now as a consultant, and they are also experimenting internally with LLMs, but they have enough resources to run those models. I don't see smaller companies having enough resources to experiment with various models at scale like they are.
True, I sorta conflated running Llama on your PC with what large companies do.
Not to mention I was somewhat conflating ChatGPT the product with OpenAI the company. What I argued was that soon enough ChatGPT itself won't be that special compared to open-source models.
OpenAI the company is in the weird position of both having a moat and yet drowning in it: they have a huge advantage in skilled experts, engineers, and the know-how to get a first-mover advantage, especially now that they are practically another subsidiary of Microsoft.
But they also have the notable disadvantage of having spent billions upon billions of dollars developing a model that, in the end, is barely, if at all, better than what one can get for free on the internet.
A small company with a few dozen specialists could present a comparable product at a fraction of the cost, simply by not having to pay back the cost of developing their own model.
I feel like OpenAI will end up in a weird place soon, maybe something like a cloud provider for companies: useful for smaller ones where brand recognition and reliability matter, but having to compete with more specialised companies offering a similar service using Llama. And at some point, large companies could just run open-source LLMs on their own servers with their own teams, bypassing OpenAI entirely.
The biggest winners here are those new small AI consulting teams that didn't have to spend nearly as much, fine-tuning models that are already made.
You probably know way more about these things than me, what do you think of this prediction?
It doesn't sound as terrible for developers as I first thought, though it pains me to see how many people quit or never went into software development due to the AI hype. We lost a third of our class from 2023, and I assume things are even worse in America and other developed countries.
OpenAI's position is indeed paradoxical. They have a considerable lead in terms of expertise and infrastructure, yet that very advantage comes with the burden of their substantial development costs. A small, nimble company leveraging open-source models can undoubtedly provide some competitive pressure by offering similar capabilities at a lower cost.
Despite these challenges, I believe OpenAI has strategic avenues to sustain and grow. Their investments in integrations, enterprise solutions, and reinforcing the reliability and scalability of their models can maintain their edge. The trust and infrastructure they offer might still be appealing enough for many businesses to stick with them, similar to the AWS analogy you mentioned.
As for the job market and the future for developers, I see your point. The AI hype has indeed introduced some volatility. However, I am cautiously optimistic. The evolution of AI and its integration into various fields will eventually balance out, creating new opportunities even as it displaces others. I still believe we’re in a transformative period where mobility and adaptation within the CS field could lead to exciting new prospects.
To your last point, it’s indeed tough to see talented individuals shy away from software development due to the current uncertainties. However, I hope this phase will pass and those who remain will likely find themselves at the forefront of some groundbreaking developments. Let's hope our lord and saviour, J-Pow has many more rate cuts for us in the future.
It is just the kind of sci-fi trope that would get investors investing.
OpenAI barely has any moat. I seriously wonder why no one is talking about how close open-weight models are getting to the closed-source ones. They need investor cash to build something that would let them stand out; otherwise no one will bother paying for the OpenAI API when you can just run your own model.
Another thing this paper mentions is how open models like Llama are quickly closing in on closed-source ones.
I wonder what kind of future LLMs have, since soon, with a good enough GPU, you can run your own model that's as good as the paid one (whose price I expect to rise considerably once investment dries up).
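The "good enough GPU" part is mostly a VRAM question: the weights alone take roughly parameter count times bytes per weight, before KV cache and runtime overhead. This is a sizing sketch only; the model size and quantization level are example assumptions.

```python
# Rough VRAM needed just for a model's weights at a given quantization.
# Ignores KV cache and runtime overhead, so treat it as a lower bound.

def weights_gib(params_billion: float, bits_per_weight: int) -> float:
    """GiB occupied by the weights alone."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / (1024 ** 3)

# e.g. a 70B-parameter model at 4-bit quantization:
print(f"{weights_gib(70, 4):.1f} GiB for weights alone")
```

Around 33 GiB for a 70B model at 4-bit puts it just out of reach of a single consumer card today, but exactly the kind of thing a flood of second-hand datacenter GPUs would change.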
One option is running ads, I guess, though I question how that would even work.
Does Meta pay social media content creators, outside of when they were bootstrapping Reels against the TikTok threat? They are a middleman for some other payments, taking a cut of them, but not making the payment itself.
If you get a big, profitable following, you even have to start paying them for boosts to keep it.