The (economic) AI apocalypse is nigh(pluralistic.net)
52 points by rendx 8 hours ago | 76 comments
- AznHisoka 5 hours agoEither the AI hype will slow down and the market will crash.
Or AI really does have 100x productivity gains and fewer humans are needed. And you lose your job.
I don't see a positive outcome in either of these scenarios…
- Dumblydorr 7 hours agoWhere's the evidence? This writing strikes me as purely belief-based. For all the overhypers of AI, you also get extreme skeptics like this, and neither side has good evidence. It's speculation. If he truly knew the future, he'd short all the companies right before the collapse.[-]
- jstanley 7 hours agoHere's some evidence:
Oracle's share price recently went up 40% on an earnings miss, because apart from the earnings miss they declared $455b in "Remaining Performance Obligations" (a term unusual enough that it caused a spike in Google Trends as people tried to work out what it means).
Of the $455b of work they expect to do and get paid for, $300b comes from OpenAI. OpenAI has about $10b in annual revenue, and makes a loss on it.
So OpenAI aren't going to be able to pay their obligations to Oracle unless something extraordinary happens with Project Stargate. Meanwhile Oracle are already raising money to fund their obligations to build the things that they hope OpenAI are going to pay them for.
These companies are pouring hundreds of billions of dollars into building AI infrastructure without any good idea of how they're going to recover the cost.
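To put rough numbers on the mismatch described above, here is a back-of-the-envelope sketch in Python using only the figures quoted in this comment; the five-year contract term is an added assumption for illustration, not a reported fact.

    # Back-of-the-envelope check of the Oracle/OpenAI numbers quoted above.
    # The 5-year contract term is an assumption for illustration only.
    rpo_total = 455e9          # Oracle's declared Remaining Performance Obligations
    openai_share = 300e9       # portion reportedly attributable to OpenAI
    openai_revenue = 10e9      # OpenAI's approximate annual revenue (currently loss-making)

    years_of_revenue = openai_share / openai_revenue
    print(f"OpenAI's Oracle commitment equals ~{years_of_revenue:.0f} years of its current revenue")

    assumed_term_years = 5     # hypothetical contract length
    required_annual_spend = openai_share / assumed_term_years
    print(f"Over {assumed_term_years} years that is ~${required_annual_spend/1e9:.0f}B/year, "
          f"or {required_annual_spend/openai_revenue:.0f}x current annual revenue")

Even spread over five years, the commitment comes to several multiples of OpenAI's entire current revenue, which is the gap the comment is pointing at.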
[-]- fabian2k 6 hours agoI'm slightly confused about how solid the expected revenue has to be for it to be counted as RPO. Does this mean OpenAI actually signed a contract binding them to spend that $300 billion with Oracle?
The second interesting part is also the part you're assuming in your argument: does it matter that OpenAI doesn't have $300 billion now, nor the revenue/profit to generate that much? Unless there are deals in the background that have already secured funding, this seems like very shady accounting.
[-]- jstanley 5 hours agoIf I earned £10k a year from my job, and I was going deeper into debt every year by spending more than that, I wouldn't go out and sign up for £300k of goods and services. But maybe that's just me.
I guess we'll find out.
- HAL3000 6 hours agoIt's a case of major FOMO. They would rather burn with the others who bet wrong than be the ones left behind.
- mjburgess 7 hours agoPre-banking 30 years of a customer's net revenue is eron-level accounting[-]
- fainpul 6 hours ago> eron-level
Enron?
[-]- xnx 6 hours agoElon?[-]
- votepaunchy 5 hours agoHas Elon taken down one of the five major accounting firms?
- ajross 3 hours ago> These companies are pouring hundreds of billions of dollars into building AI infrastructure without any good idea of how they're going to recover the cost.
Well... to be fair it's only really Anthropic (and the also-ran set like xAI) that runs the risk of being over-leveraged. OpenAI is backstopped by Microsoft at the macro level. They might try to screw over Oracle, but they could pay the bill. So that's not going to move the market beyond those two stocks. And the other big player is obviously Google, which has similarly deep pockets.
I don't doubt that there's an AI bubble. But it won't pop like that, given the size of the players. Leverage cycles are very hard to see in foresight; in 2008 no one saw the insanity in the derivatives market until Lehman blew up.
- sigilis 7 hours agoThere are various links in the article that have more information. Clicking these references will give the evidence for bad unit economics claims and whatnot.
As for predicting the moment: the author has made a prediction and wants it to be wrong. He expects the system will continue to grow for some time before collapse, and would prefer that this timeline be abbreviated to reduce the negative economic impacts. He is advising others on how to take economic advantage of his prediction and is, in his own way, likely shorting the market. It may not be options trading, but making plans for the bust is functionally similar.
[-]- AznHisoka 7 hours agoThe papers he linked all fail to support his claim. The first paper he linked simply counts mentions of the term “deep learning” in academic papers. The second surveyed people who lived in… Denmark and tried to extrapolate that to everyone globally.
His points are not backed by much evidence
- 0xDEAFBEAD 7 hours agoSide note: If you're going to short an AI company (or really, buy put options, so you don't have unlimited downside exposure), I would suggest shorting NVIDIA. My reasoning is that if we actually get a fully automated software engineer, NVIDIA stock is liable to lose a bunch of value anyways -- if I understand correctly, their moat is mostly in software.
Wile E Coyote sprints as fast as possible, realizes he zoomed off a cliff, looks down in horror, then takes a huge fall.
Specifically I envision a scenario like: Google applies the research they've been doing on autoformalization and RL-with-verifiable-rewards to create a provably correct, superfast TPU. Initially it's used for a Google-internal AI stack. Gradually they start selling it to other major AI players, taking the 80/20 approach of dominating the most common AI workflows. They might make a deliberate effort to massively undercut NVIDIA just to grab market share. Once Google proves that this approach is possible, it will increasingly become accessible to smaller players, until eventually GPU design and development is totally commoditized. You'll be able to buy cheaper non-NVIDIA chips which implement an identical API, and NVIDIA will lose most of its value.
Will this actually happen? Hard to say, but it certainly seems more feasible than superintelligence, don't you think?
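As an aside on the mechanics: the reason puts cap your downside while an outright short does not is just the payoff formula. A minimal sketch with entirely hypothetical prices and premium (illustrating the asymmetry, not advice):

    # Payoff comparison: shorting a stock vs. buying a put option.
    # All numbers here are hypothetical, purely to illustrate the asymmetry.
    def short_pnl(entry_price: float, final_price: float) -> float:
        # Loss is unbounded as final_price rises.
        return entry_price - final_price

    def put_pnl(strike: float, premium: float, final_price: float) -> float:
        # Loss is capped at the premium paid.
        return max(strike - final_price, 0.0) - premium

    entry = strike = 100.0
    premium = 8.0
    for final in (40.0, 100.0, 200.0, 400.0):
        print(f"price {final:6.0f}: short {short_pnl(entry, final):+8.1f}, "
              f"put {put_pnl(strike, premium, final):+8.1f}")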
[-]- jstanley 5 hours agoNVIDIA is about the only company actually making money on the AI bubble; they're not the one I would choose to short.
Tesla is currently trading at 260x earnings, so to actually meet that valuation they need to increase earnings by a factor of 10 pretty sharpish.
They're literally not going to do that by selling cars, even if you include Robotaxis, so really it is a bet on the Optimus robots going as well as they possibly can.
If they make $25k profit per Optimus robot (optimistic) then I think they need to sell about a million per year to make enough money to justify their valuation. And that's for a product that is not even ready to sell, let alone one where they've found out how much demand there truly is, ramped up production, etc.
For comparison the entire industrial robot market is currently about 500k units per year.
I think the market is pricing in absurdly optimistic performance for Tesla, which they're not going to be able to meet.
(I have a tiny short position in Tesla).
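For anyone who wants to check the arithmetic, here is a rough sketch. Only the 260x multiple and the $25k-per-robot figure come from the comment above; the market cap, current earnings, and the "normal" multiple are round assumptions, so the output is an order-of-magnitude estimate (somewhere in the one-to-two-million-units-per-year range) rather than an exact figure.

    # Rough sketch of the valuation math above. Market cap, current earnings, and the
    # "normal" multiple are assumptions for illustration; only the 260x P/E and the
    # $25k-per-robot figure come from the comment.
    market_cap = 1.2e12        # assumed market capitalisation
    normal_pe = 25             # assumed "justified" earnings multiple
    current_earnings = 5e9     # assumed current annual earnings (implied by ~260x P/E)

    required_earnings = market_cap / normal_pe
    extra_needed = required_earnings - current_earnings

    profit_per_robot = 25_000  # optimistic per-unit profit from the comment
    robots_per_year = extra_needed / profit_per_robot
    print(f"Required earnings ~${required_earnings/1e9:.0f}B "
          f"=> extra ~${extra_needed/1e9:.0f}B "
          f"=> ~{robots_per_year/1e6:.1f}M Optimus units/year at $25k profit each")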
- stuartjohnson12 7 hours agoIf you told me all this about Enron or FTX today, while they were still industry darlings, I for one wouldn't want to bet against them. For every FTX, where cooked books lead to epic failure, there is a Tether, where cooked books lead to accidentally unlocking an unlimited money tap through all sorts of dubious means.
- trumbitta2 7 hours agoNot an expert, but I'm convinced they will all pivot to military applications before they go bankrupt, and that will unleash a whole new type of hell[-]
- AznHisoka 7 hours ago>> A much-discussed MIT paper found that 95% of companies that had tried AI had either nothing to show for it, or experienced a loss
The paper they linked to just analyzed how many times “deep learning” appears in academic papers…
This is the proof that most companies unsuccessfully tried AI?
[-]- blackbear_ 7 hours agoThe link is wrong, I believe they meant to link this one: https://mlq.ai/media/quarterly_decks/v0.1_State_of_AI_in_Bus...
- JimDabell 7 hours agoI think they linked to the wrong thing. It’s been discussed several times here:
- karlkloss 7 hours agoThere's always somebody predicting an apocalypse. This guarantees that, regardless of what happens, there's somebody who can claim they were right.
Which makes those predictions completely useless. You might as well read your horoscope.
[-]- kemitchell 34 minutes agoYou've given the argument from fallacy, dismissing an argument without any reference to its content, but only to its conclusion. The existence of bad arguments for a proposition doesn't condemn all other arguments for the same.
The question is whether any particular argument's strong or weak. That's a matter of evidence and reasoning.
- newyankee 5 hours agoThe funny thing, though, is that on balance I feel GPT has delivered more than it has disappointed since its public launch almost three years back. I still feel that we should run into limitations soon, and maybe GPT-5 is an example of that.
Surely the tech still has a long way to go and will keep improving, especially now that the money has attracted everyone to work on it in ways that weren't considered important until now. But the financial side of things has to correct a bit for healthy growth.
- AndrewOMartin 4 hours agoMarkets can remain irrational longer than you can remain solvent.
- Havoc 7 hours agoI'm not convinced unit economics is the right lens here, given that it's a general-purpose technology.
For the very near term, perhaps, but the large-scale infra rollouts strike me as a 10+ year strategic bet, and on that timescale what matters is whether this delivers on automation and productivity.
[-]- TheOtherHobbes 7 hours agoThey're all going to start selling ads, obviously.[-]
- Havoc 5 hours agoDoes seem likely for Google, though integrating ads into text in a durable way could be challenging.
If it's overt then it's easily filtered out; if it's baked in too deep then it harms response quality.
- anon191928 8 hours agoGreat points, but timing it can be very hard. It can last many more years because this time they have a thing called the "money printer". When the crash happens, they will use it.
Yes it prints whatever amount they want, even trillions. Magically(!)
[-]- viking123 7 hours agoMost people who try to time these usually get it completely wrong and end up losing huge amounts of money. I just stay invested in the indexes and some long-term stocks; every time I try to predict something, it goes badly.[-]
- Quarrelsome 7 hours agoBear in mind that the more those six or seven companies grow, the more of your index they become. I think they're over 30% of the market? So an index tracker is still greatly exposed to this.[-]
- nemomarx 6 hours agoI wish I could get an index without them but it would probably have no growth basically - the rest of the market is struggling in comparison, right?
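The exposure point above is easy to quantify: if the top handful of names make up roughly 30% of a cap-weighted index and they fall hard while everything else stays flat, the tracker still takes a double-digit hit. A toy calculation with assumed weights and drawdowns:

    # Toy illustration of index concentration risk. Weights/drawdowns are assumptions.
    top_weight = 0.30      # assumed combined weight of the six or seven largest holdings
    rest_weight = 1.0 - top_weight

    top_drawdown = 0.50    # assumed fall in the concentrated names
    rest_drawdown = 0.00   # assume the rest of the market is flat

    index_return = -(top_weight * top_drawdown + rest_weight * rest_drawdown)
    print(f"Index return: {index_return:.0%}")   # -15% even though 70% of holdings are flat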
- rsynnott 7 hours ago> When crash happens, they will use it.
You're suggesting that _governments_ will bail out the AI industry? I mean, why would they do that?
[-]- nemomarx 7 hours agoI could see governments having a strong interest in bailing out Nvidia, Microsoft, etc at least?[-]
[-]- XorNot 7 hours agoWhy would Nvidia be in trouble? They're selling shovels during a gold rush, and have slow-boated their scale-up - they're in no trouble.[-]
- tyleo 7 hours agoAgreed, I also feel like Microsoft is diversified enough that this would not bring them down.
Probably the hordes of startups would be most impacted. It isn't clear the government would bail them out.
[-]- Ekaros 7 hours agoCompanies like Nvidia, Microsoft, Amazon and Google are going nowhere. But their valuations will, in my opinion, take a massive dip, which will then have all sorts of other effects.
They are not going to zero, but they can lose a lot from the current price.
[-]- tyleo 7 hours agoYeah, I agree with this. Maybe it's not obvious from my point. I suspect companies will have RIFs and haircuts but not need bailouts.
- HighGoldstein 7 hours agoNvidia might have a secondary crash if cheap GPUs flood the market. Or we get a resurgence of crypto mining, who knows.
- nemomarx 7 hours agoIf demand for compute and cuda dropped suddenly would they be okay going back to selling graphics cards?[-]
- XorNot 7 hours agoTheir profitability would shrink, but they'd only be in trouble if they were taking on debt to expand operations on the expectation of future growth. AFAIK one of the annoyances gamers have had with Nvidia is that after crypto, and now with AI, they've generally been very careful to control how they expand production, since they seem quite aware the party could stop any time. It certainly helps to have a lot of product lock-in - people will bear much higher prices to stay with Nvidia at this point (due to, as noted, CUDA).
Sure, the stock price wouldn't be to the moon anymore, but that doesn't materially affect operations if they're still moving product - and gaming isn't going anywhere.
The stock price of a company can crash without it materially affecting the company in any way... provided the company isn't taking on expansion operations on the basis of that continued growth. Historically, Nvidia have avoided this.
- tyleo 7 hours agoWell they sell graphics cards now so yes. Why would they suddenly not be okay with selling graphics cards if the AI bubble popped?
- surgical_fire 7 hours ago"If we don't build it, China will"
I am sure you have already heard this sort of argument for AI. It's a way to bait that juicy government money.
[-]- Quarrelsome 7 hours agoIt's way too expensive. We bail out banks so the little people don't lose their shirts too. There's no equivalent in AI.[-]
- anon191928 6 hours agoYou are missing the point: once the AI companies go down, they will take down the S&P 500 too, so people's retirement accounts will be affected.[-]
- Quarrelsome 6 hours agoI think there's a bit of a difference between preventing a run on the banks and propping up the entire stock market for the sake of just a handful of companies that all have big enough pockets to fail.[-]
- anon191928 6 hours agoThe COVID crash, and what they did in response to that market crash, disagrees.[-]
- Quarrelsome 4 hours agowhat are you referencing in particular?[-]
- anon191928 3 hours ago[-]
- Quarrelsome 2 hours agosorry, wasn't that in response to a natural disaster? I don't see how we so easily sweep that into the same idea as "bail out".
- tesdinger 7 hours ago3% of world GDP is more than the $2 trillion needed to fund AI.
- TYPE_FASTER 5 hours agoThe AI bubble feels similar to the dot-com bubble. The pattern is the same, right? There's some tech, and a hype wave that people try to surf (if you paddle fast enough, you can start something and cash out in time). When the water recedes before the next wave, the people and organizations involved have either ridden the wave, cashed out, and walked off the beach to their next thing; missed the wave and will try next time; or got pounded and had their board broken. Or they ripped a great line, did a backflip like Medina, threw their hands in the air, and paddled back out to catch the next wave.
But you don't always know what the wave is going to look like while it's building. You just know there's a wave, and either you get on it or you don't. The connectivity of the web was obvious; the monetization wasn't (remember K-Tel, anybody?). The power of modern AI is obvious; the monetization and final form of the tech aren't. I mean, using natural language is cool and all, and I think there is a lot of value in using models to help with engineering and other work, but I think it's too soon to say what the end game is.
- roxolotl 7 hours agoI want the hype to die and the bubble to pop as much as Ed and Cory and everyone else writing about it, but right now it's just them basically recycling the same bad news and posting about it. I'd love to see some writing that looks at the factors that caused previous pops and lines them up with the factors today, to try to determine what's actually coming. Clearly the market is irrational as hell right now, but seemingly very little is going to change that. The closest I've seen to what I'm looking for is the coverage over at Notes on the Crises[0], and he also seems bewildered.
Edit: I found this piece which does look at historical bubbles/market events and tries to discuss how they play out in terms of timing [1]
0: https://www.crisesnotes.com/ 1: https://paulkrugman.substack.com/p/why-arent-markets-freakin...
- jcattle 7 hours agoCan anyone give more than a hand-wavy explanation of how this crash will come about? This paragraph reads kind of like: companies not profitable, no more money coming in, ????, crash.
>> I firmly believe the (economic) AI apocalypse is coming. These companies are not profitable. They can't be profitable. They keep the lights on by soaking up hundreds of billions of dollars in other people's money and then lighting it on fire. Eventually those other people are going to want to see a return on their investment, and when they don't get it, they will halt the flow of billions of dollars. Anything that can't go on forever eventually stops.
How will this actually lead to a crash? What would that crash look like? Are banks going bust? Which banks would go bust? Who is losing money, why are they losing money?
[-]- hyperbovine 7 hours agoSee comments elsewhere in this thread. To cite one well-known recent example, Oracle stock went crazy recently (and Larry Ellison briefly became the world's richest person) after they disclosed in their earnings report that they are expecting something like $400b in revenue from serving OpenAI in the coming years. These overinflated expectations systematically multiply and propagate until you arrive at the situation we're in. As soon as that does _not_ happen and everyone realizes it, the whole house of cards comes crashing down, in a very 1929 sort of way.
This is the point TFA is making, albeit a bit hyperbolically.
[-]- piva00 6 hours agoIt also happened in the dot-com bubble: telecom companies were providing leases/loans to their customers so they could purchase more networking equipment.
It's very similar to the circular funding happening between Nvidia and their customers: Nvidia funds investments in AI datacenters, that money gets spent on Nvidia equipment, and each step of the cycle takes a cut to pay its own OpEx, so the money getting back to Nvidia diminishes with each pass through the cycle.
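A crude way to picture that loop: start with a pot of vendor financing, skim off OpEx at every hop, and see how much actually returns to the chip vendor each time around. The percentages below are made up; the only point is that the loop leaks and shrinks unless new outside money keeps entering.

    # Toy model of circular vendor financing: vendor funds customer, customer spends it
    # back on vendor hardware, but every hop skims OpEx. All rates are made-up assumptions.
    initial_financing = 100.0   # arbitrary units of vendor financing
    opex_cut_per_hop = 0.20     # assumed share consumed as OpEx at each step
    hops_per_cycle = 2          # e.g. datacenter operator, then model company

    money = initial_financing
    for cycle in range(1, 6):
        for _ in range(hops_per_cycle):
            money *= (1 - opex_cut_per_hop)
        print(f"cycle {cycle}: {money:.1f} units return to the vendor")
    # The flow shrinks geometrically; it only looks like growing revenue while new money enters.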
- _ZeD_ 6 hours agoeveryone will lose money.[-]
- jcattle 6 hours agoHow? Everyone everyone?
How would some guy who isn't invested in the stock market, is building a house, and works as a plumber be impacted?
[-]- nemomarx 6 hours agoIf his clients lose all their money in stocks or lose their jobs, he also loses work. Depressions tend to impact everyone in the end once unemployment gets high enough.
- ToucanLoucan 7 hours ago> What would that crash look like?
What it usually looks like when one of the valley's "revolutions" fails to materialize: a formerly cheap and accessible tech becomes niche and expensive, acres of e-waste, the job market is flooded with applicants with years of experience in something no longer considered valuable, and the people responsible sail off into the sunset now richer for having rat fucked everyone else involved in the scheme.
In this case though, given the sheer scale of the money about to go away, I would also add: a lot of pensions are going to see huge losses; a lot of cities that have waived various taxes to encourage data-center build-outs are going to be left holding the bag, and possibly huge, ugly concrete buildings within their limits that will need to be demolished; and, a special added one for this bubble in particular, we'll have a ton of folks out there psychologically dependent on a product that is either priced out of their ability to pay or completely unavailable, and the ensuing mental health crises that might entail.
[-]- jcattle 6 hours agoIsn't the thing that costs everyone an arm and a leg at the moment the race for better models? So all of the training everyone is doing to get SOTA in some obscure AI benchmark? From all of the analysis I've read, inference is quite profitable for the AI companies. So at least for the last part:
> we'll have a ton of folks out there psychologically dependent on a product that is either priced out of their ability to pay or completely unavailable, and the ensuing mental health crises that might entail.
I doubt that this will become true. If there's one really tangible asset these companies are producing, one which would be worth quite a bit in a bankruptcy, it's the model architectures and weights, no?
[-]- ToucanLoucan 6 hours ago> Isn't the thing that costs everyone an arm and a leg at the moment the race for better models? So all of the training everyone is doing to get SOTA in some obscure AI benchmark? From all of the analysis I've read, inference is quite profitable for the AI companies.
From what I've read: the cost to AI companies, per inference as a single operation, is going down. However, all newer models, all reasoning models, and their "agents" thing that's still trying desperately to be an actual product category require orders of magnitude more inferences per request to operate. It's also worth noting that code generation and debugging, one of the few LLM applications I will actually say has a use and is reasonably good, also costs far more inferences per request. And that number of inferences can increase massively with a sufficiently large chunk of code you're asking it to look at/change.
> If there's one really tangible asset these companies are producing, which would be worth quite a bit in a bankruptcy it's the model architectures and weights, no?
I mean, not really? If the companies enter bankruptcy, that's a pretty solid indicator that the models are not profitable to operate, unless you're envisioning this as a long-tail support model like you see with old MMO games, where a company picks up a hugely expensive-to-produce product, like LOTRO, and runs it with basically a skeleton crew of devs and support folks for the handful of users who still want to play it, and ekes out a humble if legitimate profit for doing so. I guess I could see that, but it's also worth noting that type of business has extremely thin margins, and operating servers for old MMO games is WAY less energy- and compute-intensive than running any version of ChatGPT post-2023.
Edit: Also, worth noting specifically in the case of OpenAI are its deep and OLD ties to Microsoft. Microsoft doesn't OWN OpenAI in any meaningful sense, but it is incredibly core to its entire LLM backend. IMO (not a lawyer), if OpenAI were to go completely belly up, I'm not even sure the models would go to any sort of auction, unless Microsoft was ready to just let them do so. I think they'd devour whatever of the tech stack is available, whole, without really spending too much, if any, money on it, and continue running it as is.
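To illustrate the "cheaper per token, but far more tokens per request" point made above, here is a toy comparison; every number is invented purely to show how the two effects can offset each other.

    # Toy comparison: cheaper tokens vs. reasoning/agent workflows that consume far more
    # of them per user request. All figures are invented for illustration only.
    cost_per_1k_tokens_old = 0.010   # assumed older-model price
    cost_per_1k_tokens_new = 0.002   # assumed newer-model price (5x cheaper per token)

    tokens_per_request_chat = 2_000        # assumed simple chat completion
    tokens_per_request_agent = 150_000     # assumed agentic/reasoning coding task

    old_chat = cost_per_1k_tokens_old * tokens_per_request_chat / 1000
    new_agent = cost_per_1k_tokens_new * tokens_per_request_agent / 1000
    print(f"old simple request:  ${old_chat:.3f}")
    print(f"new agentic request: ${new_agent:.3f}  "
          f"({new_agent/old_chat:.0f}x more expensive despite cheaper tokens)")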
- ChrisArchitect 4 hours agoEarlier Discussion: https://news.ycombinator.com/item?id=45399893
- shubhamjain 7 hours ago> I firmly believe the (economic) AI apocalypse is coming. These companies are not profitable. They can't be profitable. They keep the lights on by soaking up hundreds of billions of dollars in other people's money and then lighting it on fire.
This is what I don't like. Debating in extremes. How can AI have bad unit economics? They are literally selling compute and code for a handsome markup. This is classic software economics, some of the best unit economics you can get. Look at Midjourney: it pulled in hundreds of millions without raising a single dime.
Companies are unprofitable because they are chasing user growth and subsidising free users. This is not to say there isn't a bubble, but it's a rock-solid business and it's here to stay. Yes, the music will stop one day, and there will be a crash, but I'd bet that most of the big players we see today will still be around after the shakeout. Anecdote: my wife is so dependent on ChatGPT that if the free version ever stopped being good enough, she'd happily pay for premium. And this is coming from someone who usually questions why anyone needs to pay for software.
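The "good unit economics, but free users are subsidised" framing can be sketched directly: whether the blended margin is positive depends almost entirely on the free-to-paid ratio. All of the prices and serving costs below are hypothetical.

    # Sketch of blended margin with a free tier. Every number here is a hypothetical
    # assumption; the point is only how the free/paid ratio drives overall profitability.
    price_per_paid_user = 20.0       # $/month subscription
    inference_cost_paid = 6.0        # assumed serving cost for a paying user
    inference_cost_free = 1.5        # assumed serving cost for a free user

    def blended_margin(paid_users: int, free_users: int) -> float:
        revenue = paid_users * price_per_paid_user
        cost = paid_users * inference_cost_paid + free_users * inference_cost_free
        return revenue - cost

    for free_per_paid in (2, 10, 20):
        m = blended_margin(paid_users=1_000, free_users=1_000 * free_per_paid)
        print(f"{free_per_paid:2d} free users per paid user: monthly margin ${m:,.0f}")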
[-]- fluidcruft 5 hours agoGenerally I think the question is whether they actually are selling it at a markup. Inference is easier to reason about; I think the problem is financing training. The issue is that the future value of inference seems to be massively overstated to justify present-day expenditures on training (particularly since the value of today's training evaporates extremely quickly).
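Framed as an amortisation problem, the worry above is that a training run only pays off if the inference margin it earns before the model is superseded exceeds its cost. A toy version with every figure assumed:

    # Toy amortisation of a training run. All figures are assumptions for illustration.
    training_cost = 3e9              # assumed cost of one frontier training run
    useful_life_months = 12          # assumed months before the model is superseded
    monthly_inference_margin = 150e6 # assumed gross margin earned on inference per month

    recovered = monthly_inference_margin * useful_life_months
    print(f"Recovered before obsolescence: ${recovered/1e9:.1f}B "
          f"vs training cost ${training_cost/1e9:.1f}B "
          f"=> net ${(recovered - training_cost)/1e9:+.1f}B")
    # A shorter useful life or flat margins make the hole deeper; that's the
    # "value of training evaporates quickly" problem in one line.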
- whstl 7 hours ago> Further: the topline growth that AI companies are selling comes from replacing most workers with AI, and re-tasking the surviving workers as AI babysitters ("humans in the loop"), which won't work. Finally: AI cannot do your job, but an AI salesman can 100% convince your boss to fire you and replace you with an AI that can't do your job
This hits home. A lot of the claimed improvements due to AI that I see are not really supported by measurements in actual companies. Or they could have been just some regular automation 10 years ago, except requiring less code.
If anything I see a tendency of companies, and especially AI companies, to want developers and other workers to work 996 in exchange for magic beans (shares) or some other crazy stupid grift.
[-]- AznHisoka 7 hours agoSo what metric would you look at that would support the idea that AI is improving a company?[-]
- whstl 6 hours agoI guess anything other than just claims from people that have a stake in it?
If companies are shipping AI bots with a "human in the loop" to replace what could have been "a button with a human in the loop", but the deployment of the AI takes longer, then it's DEFINITELY not really an improvement; it's just a pure waste of money and electricity.
Similarly, what I see that's different from the pre-AI era is way too many companies, in SV and elsewhere, staying roughly the same size and shipping roughly the same amount of features as before (or fewer!), but now requiring employees to do 996. That's the definition of a loss of productivity.
I'm not saying I hold the truth, but what I see in my day to day is that companies are masters at absorbing any kind of improvement or efficiency gain. Inertia still rules.
[-]- AznHisoka 5 hours agoSo would lower headcount with stable/improving revenue be a metric you would look at?
- adrianbooth17 6 hours agoEvery day I see articles on HN discussing the AI bubble potentially crashing. The large number of such articles appearing daily is increasing my confidence that the AI space will be fine.
- petesergeant 7 hours ago> and when the bubble bursts, the money-hemorrhaging "foundation models" will be shut off
This is not a serious piece of writing.
[-]- baal80spam 7 hours agoOf course it isn't. It's a FUD piece.