DRAM alternates between feast and famine; it's the nature of a business where the granularity of investment is so huge (you have a fab or you don't, and they cost billions, maybe tens of billions by now). So, it will swing back. Unfortunately it looks like maybe 3-5 years on average, from some analysis here:
https://storagesearch.com/memory-boom-bust-cycles.html
(That's just me eyeballing it, feel free to do the math)
I am so glad that both the top-rated and the majority of comments on HN finally understand the DRAM industry, instead of the constant "DRAM is a cartel, that's why things are expensive".
Also worth mentioning: Samsung's DRAM and NAND profits are what keep Samsung Foundry fighting TSMC. Especially for those who think TSMC is somehow a monopoly.
Another thing to point out, which hasn't been mentioned yet: China is working on both DRAM and NAND. Both LPDDR5 and stacked NAND are already in production, waiting on yield and scale. Higher prices will finally be the perfect timing for them to join the commodity DRAM and NAND race. Good for consumers, I suppose; not so good for a lot of other things, which I won't go into.
The firms can coordinate by agreeing on a strategy they deem necessary for the future of the industry. That strategy requires significant capital expenditures, the industry does not get (or does not want) outside investment to fund it, and if any of the firms defects and keeps prices low the others cannot execute on the strategy - so they all agree to raise prices.
Then, after the strategy succeeds, they have gotten addicted to the higher revenues, they do not allow prices to fall as fast as they should, their coordination becomes blatantly illegal, and they have to get smacked down by regulators.
> The firms can coordinate by agreeing on a strategy they deem necessary for the future of the industry.. Then, after the strategy succeeds, they have gotten addicted to the higher revenues, they do not allow prices to fall as fast as they should, their coordination becomes blatantly illegal..
So said and did the infamous Phoebus cartel, to unnaturally "fix" the prices and quality of light bulbs.
For more than a century, one strange mystery has puzzled the world: why do old light bulbs last for decades while modern bulbs barely survive a couple of years?
The answer lies in a secret meeting held in Geneva, Switzerland in 1924, where the world’s biggest light bulb companies formed the notorious Phoebus Cartel.
Their mission was simple but shocking: control the global market, set fixed prices, and most importantly… reduce bulb lifespan.
Before this cartel, bulbs could easily run for 2500+ hours. But after the Phoebus Cartel pact and its enforcement, all companies were forced to limit lifespan to just 1000 hours. More failures meant more purchases. More purchases meant more profit. Any company that refused faced heavy financial penalties.
The most unbelievable proof is the world-famous Livermore Fire Station bulb in California, glowing since 1901. More than 120 years old. Still alive.
While our new incandescent bulbs die in 1–2 years.
Though the Phoebus cartel was dissolved in the 1930s due to government pressure, its impact still shadows modern manufacturing. Planned obsolescence didn’t just begin here… but Phoebus made it industrial.
The Phoebus cartel didn't collude just to make the light bulbs have a shorter lifespan. They upped the standard illumination a bulb emitted so that consumers needed fewer of them to see well. With an incandescent you have a kind of sliding scale of brightness:longevity (with curves on each end that quickly go exponential, hence the longest lasting light bulb that's so dim you can barely read by its light). The brighter the bulb, the shorter the lifespan.
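To make that tradeoff concrete, here are the commonly quoted rule-of-thumb exponents for incandescents (light output scaling roughly as voltage^3.4 and lifespan as voltage^-13; these are rough empirical fits, not from the comment above, so treat the numbers as illustrative):

    # Rule-of-thumb incandescent scaling laws (rough empirical fits):
    # brightness rises slowly with voltage while lifespan collapses fast.
    def incandescent(v_ratio, rated_life_h=1000.0):
        brightness = v_ratio ** 3.4             # light output vs. rated
        life_h = rated_life_h * v_ratio ** -13  # lifespan vs. rated
        return brightness, life_h

    for v in (1.0, 0.95, 0.9, 0.8):
        b, l = incandescent(v)
        print(f"{v:.0%} of rated voltage: {b:.0%} brightness, ~{l:,.0f} h life")

Run a bulb at 80% of rated voltage and you get roughly half the light but an order of magnitude more hours - the regime the legendary long-lived dim bulbs live in.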
Also, incandescent bulb lifespan is reduced by repeated power cycling. Not only is the legendary firehouse bulb extremely dim, it has been turned off and back on again very few times. Leaving all your lights on all the time would be a waste of power for the average household, more expensive than just replacing the bulbs every so often.
Also lightbulb dimmers were a thing back in the day, so you could always buy more lightbulbs and lower the brightness of each to take advantage of that exponential curve in lifespan.
Most of us who've been on Earth for a while know that courts often get it wrong. Even if the particular court decision you mention was correct, that does not mean price fixing is the main or underlying reason DRAM prices sometimes go up.
Historically, yes. But we haven't had demand like this AI wave before. What happens when OpenAI and NVIDIA monopolize the majority of DRAM output?
I wouldn't be so sure. I've seen analyses making the case that this new phase is unlike previous cycles and that DRAM makers will be far less willing to invest significantly in new capacity, especially consumer DRAM as opposed to enterprise DRAM or HBM (and even there, there's still a significant risk of the AI bubble popping). The shortage could last a decade. Right now DRAM makers are benefiting to an extreme degree, since they can basically demand any price for what they're making now, which reduces the incentive even more.
Do we really think the current level of AI-driven data center demand will continue indefinitely? The world only needs so many pictures of bears wearing suits.
The pop culture perception of AI just being image and text generators is incorrect. AI is many things, they all need tons of RAM. Google is rolling out self-driving taxis in more and more cities for instance.
Congrats on engaging with the facetious part of my comment, but I think the question still stands: do you think the current level of AI-driven data center demand will continue indefinitely?
I feel like the question of how many computers are needed to steer a bunch of self-driving taxis probably has an answer, and I bet it's not anything even remotely close to what would justify a decade's worth of maximum investment in silicon for AI data centers, which is what we were talking about.
No, even the best-case 10% scenario of returns on AI won't make it. The bubble is trying to replace all human labor, which is why it is a bubble in the first place. No one is being honest that AGI is not possible with this kind of tech. And scale won't get them there.
The most likely direct response is not new capacity, it's older capacity running at full tilt (given the now higher margins) to produce more mature technology with lower requirements on fabrication (such as DDR3/4, older Flash storage tech, etc.) and soak up demand for these. DDR5/GDDR/HBM/etc. prices will still be quite high, but alternatives will be available.
...except the current peak in demand is mostly driven by the build-out of AI capacity.
Both inference and training workloads are often bottlenecked on RAM speed, and trying to shoehorn older/slower memory tech in there would require a non-trivial amount of R&D to go into widening the memory bus on CPUs/GPUs/NPUs, which is unlikely to happen - those are in very high demand already.
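A back-of-envelope sketch of why inference is bandwidth-bound (the model size and bandwidth figures are my own illustrative assumptions):

    # Each generated token has to stream the full weight set through the
    # memory bus at least once, so bandwidth caps tokens/sec regardless of FLOPs.
    params = 70e9            # assume a 70B-parameter model
    bytes_per_param = 2      # fp16/bf16 weights
    hbm_bandwidth = 3.35e12  # ~3.35 TB/s, roughly HBM3 on a high-end accelerator

    bytes_per_token = params * bytes_per_param  # 140 GB streamed per token
    print(f"~{hbm_bandwidth / bytes_per_token:.0f} tokens/s upper bound")  # ~24

Swapping in slower memory drops that ceiling proportionally, which is why nobody wants to pair these accelerators with DDR3/4.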
Even if AI stuff does really need DDR5, there must be lots of other applications that would ideally use DDR5 but can make do with DDR3/4 if there's a big difference in price
I mean, AI is currently hyped, so the most natural and logical assumption is that AI drives these prices up primarily. We need compensation from those AI corporations. They cost us too much.
Doesn't the same factory produce enterprise (i.e. ECC) and consumer (non-ECC) DRAM?
If there is high demand for the former due to AI, they can increase its production to generate higher profits. This cuts the production capacity of consumer DRAM and leads to higher prices in that segment too. Simple supply & demand at work.
Conceptually, you can think of it as "RAID for memory".
A consumer DDR5 module has two 32-bit-wide buses, each implemented using, for example, 4 chips that handle 8 bits apiece, operating in parallel - just like RAID 0.
An enterprise DDR5 module has a 40-bit-wide bus implemented using 5 chips. The memory controller uses those 8 additional bits to store parity calculated over the 32 regular bits - so just like RAID 4 (or RAID 5; I haven't dug into the details too deeply). The whole magic happens inside the controller; the DRAM chip itself isn't even aware of it.
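A minimal sketch of the RAID-4-style scheme described above, with four data "chips" and a fifth holding XOR parity (real ECC DIMMs use SEC-DED codes such as Hamming, which can also correct single-bit flips; plain XOR parity only rebuilds a known-bad chip, exactly like RAID):

    from functools import reduce

    def write_word(data_chips):            # e.g. 4 bytes, one per data chip
        parity = reduce(lambda a, b: a ^ b, data_chips)
        return data_chips + [parity]       # the 5th "chip" stores parity

    def rebuild(chips, dead):              # recover a known-bad chip, RAID-style
        return reduce(lambda a, b: a ^ b,
                      (c for i, c in enumerate(chips) if i != dead))

    word = write_word([0x12, 0x34, 0x56, 0x78])
    assert rebuild(word, dead=1) == 0x34   # chip 1's byte recovered from the rest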
Given the way the industry works (some companies do DRAM chip production, it is sold as a commodity, and others buy a bunch of chips to turn them into RAM modules) the factory producing the chips does not even know if the chips they have just produced will be turned into ECC or non-ECC. The prices rise and fall as one because it is functionally a single market.
Each memory DIMM/stick is made up of multiple DRAM chips. ECC DIMMs have an extra chip for storing the error-correcting parity data.
The bottleneck is with the chips and not the DIMMs. Chip fabs are expensive and time consuming, while making PCBs and placing components down onto them is much easier to get into.
Yes, but if new capacity is also redirected to be able to be sold as enterprise memory, we won't see better supply for consumer memory. As long as margins are better and demand is higher for enterprise memory, the average consumer is screwed.
Does it matter that AI hardware has a much shorter shelf life/faster upgrade cycle? Meaning we may see the RAM chips resold/thrown back onto the used market quicker than before?
I mean, the only difference we care about is how much of it is actual RAM vs HBM (to be used on GPUs) and how much it costs. We want it to be cheap. So yes, there's a difference if we're competing with enterprise customers for supply.
I don't really understand why every little thing needs to be spelled out. It doesn't matter. We're not getting the RAM at an affordable price anymore.
A LOT of businesses learned during Covid that they can make more money by permanently reducing output and jacking up prices. We might be witnessing the end times of economies of scale.
The idea is someone else comes in that's happy to eat their lunch by undercutting them. Unfortunately, we're probably limited to China doing that at this point as a lot of the existing players have literally been fined for price fixing before.
It seems more likely that someone else comes in and either colludes with the people who are screwing us to get a piece of the action, or gets bought out by one of the big companies who started all this. Since on the rare occasions companies get caught they only get weak slaps on the wrist, paying a fraction of what they made in profits (basically just the US demanding its cut), I don't have much faith things will improve any time soon.
Even China has no reason to reduce prices much for memory sold to the US when they know we have no choice but to buy at the prices already set by the cartel.
I expect that if China does start making memory they'll sell it cheap within China and export it at much higher prices. Maybe we'll get a black market for cheap DRAM smuggled out of China though.
I think it is in part a system-level response to the widespread just-in-time approach of those businesses' clients. A just-in-time client is very "flexible" on price when supply is squeezed. After that back and forth, I think we'll see a return to some degree of supply buffering (warehousing) to damp down the supply/price shocks in the pipelines.
In a traditional pork cycle there's a relatively large number of players and a relatively low investment cost. The DRAM market in the 1970s and 1980s operated quite similarly: you could build a fab for a few million dollars, and it could be done by a fab which also churned out regular logic - it's how Intel got started! There were dozens of DRAM-producing companies in the US alone.
But these days the market looks completely different. The market is roughly equally divided between SK Hynix, Micron, and Samsung. Building a fab costs billions and can easily take five years - if not a decade - from start to finish. Responding to current market conditions is basically impossible; you have to plan for the market you expect years from now.
Ignoring the current AI bubble, DRAM demand has become relatively stable - and so has the price. Unless there's a good reason to believe the current buying craze will last over a decade, why would the DRAM manufacturers risk significantly changing their plans and potentially creating an oversupply in the future? It's not like the high prices are hurting them...
Also, current political turbulence makes planning for the long term extremely risky.
Will the company be evicted from the country in 6 months? A year? Will there be 100% tariffs on competitors' imports? Or 0%? Will there be an anti-labor gov't in effect when the investment matures, or a pro-labor one?
The bigger the investment, the longer the investment timeframe, and the more sane the returns - the harder it is to make the investment happen.
High risk requires a correspondingly high potential return.
That everyone has to pay more for current production is a side effect of the uncertainty, because no one knows the odds of even future production actually happening, let alone the next fancy whiz-bang technology.
No, a wafer is very much not a wafer. DRAM processes are very different from making logic*. You don't just make memory in your fab today and logic tomorrow. But even when you stay in your lane, the industry operates on very long cycles and needs scale to function at any reasonable price at all. You don't just dust off your backyard fab to make the odd bit of memory whenever it is convenient.
Nobody is going to do anything if they can't be sure that they'll be able to run the fab they built for a long time and sell most of what they make. Conversely fabs don't tend to idle a lot. Sometimes they're only built if their capacity is essentially sold already. Given how massive the AI bubble is looking right now, I personally wouldn't expect anyone to make a gamble building a new fab.
* Someone explained this at length on here a while ago, but I can't seem to find their comment. Should've favorited it.
Sure, yes the cost of producing a wafer is fixed. Opex didn’t change that much.
Following your reasoning, which is common in manufacturing, the capex needed is already allocated.
So, where does the 2x price hike come from if not supply/demand?
The cost to produce did not go up 100%, or even 20%
Actually, DRAM fabs do get scaled down, very similar to the Middle East scaling down oil production.
DRAM/flash fab investment probably did get scaled down due to the formerly low prices, but once you do have a fab it makes sense to have it produce flat out. Then that chunk of potential production gets allocated into DRAM vs. HBM, various sorts of flash storage etc. But there's just no way around the fact that capacity is always going to be bottlenecked somehow, and a lot less likely to expand when margins are expected to be lower.
> Sometimes they're only built if their capacity is essentially sold already.
"Hyperscalers" already have multi-year contracts going. If the demand really was there, they could make it happen. Now it seems more like they're taking capacity from what would've been sold on the spot or quarterly markets. They already made their money.
Well, I've experienced both to some degree in the past. The previous long stretch of very similar hardware performance was when PCs were exorbitantly expensive and the Commodore 64 was the main "home computer" (at least in my country), through the late 80s and early 90s.
That period of time had some benefits. Programmers learned to squeeze absolutely everything out of that hardware.
Perhaps writing software for today's hardware is again becoming the norm rather than being horribly inefficient and simply waiting for CPU/GPU power to double in 18 months.
I was lucky. I built my AM5 7950X Ryzen PC with 2x48GB of DDR5 two years ago. A month ago I bought a 4x48GB kit, with the idea of building another home server around the old 2x48GB kit.
Today my old G.Skill 2x48GB kit costs double what I paid for the 4x48GB.
Furthermore, I bought two used RTX 3090s (for AI) back then. A week ago I bought a third one for the same price (for VRAM in my server).
> It's kinda sad when you grow up in a period of rapid hardware development and now see 10 years going by with RAM $/GB prices staying roughly the same.
But you're cherry-picking prices from a notable period of high prices (right now).
If you had run this comparison a few months ago or if you looked at averages, the same RAM would be much cheaper now.
I think that goes to show that official inflation benchmarks are not very practical/useful in terms of the buckets of things that people actually buy or desire. If the bucket that measured inflation included computer parts (GPUs?), food and housing - i.e. all the things a geek really needs - inflation would be way higher...
> If the bucket that measured inflation included computer parts (GPUs?), food and housing - i.e. all the things a geek really needs - inflation would be way higher...
A house is $500,000
A GPU is $500
You could put GPUs into the inflation bucket and it wouldn't change anything. Inflation trackers count cost of living and things you pay monthly, not the one-time luxury expense a geek makes every 4 years for entertainment.
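Rough numbers (all invented) on why a GPU's weight in a cost-of-living basket would be negligible:

    # A $500 GPU amortized over 4 years is ~$10/month against thousands in
    # recurring spending, so its basket weight rounds to almost nothing.
    monthly = {"housing": 2000, "food": 800, "transport": 600, "gpu": 500 / 48}
    total = sum(monthly.values())
    for category, spend in monthly.items():
        print(f"{category:>9}: {spend / total:.1%} of the basket")
    # gpu lands well under 1%, so even a 2x GPU price spike barely moves the index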
You'd also need to account for the dollar's decline against other currencies (which, yes, is possibly somewhat factored into dollar inflation, so you'd have to do the inflation calculation in euros and then convert to dollars, accounting for the decline in value).
I just gave up and built an AM4 system with a 3090 because I had 128GB of DDR4 UDIMMs on hand; the whole build cost less than the memory alone would have for an AM5/DDR5 build.
Really wish I could replace my old Skylake-X system, but even DDR4 RDIMMs for an older Xeon are crazy now, let alone DDR5. Unfortunately I need slots for 3x Titan Vs for the 7.450 TFLOPS each of FP64. Even the 5090 only does 1.637 TFLOPS of FP64, so I'm just hoping that old system keeps running.
If you don't need full IEEE 754 double precision, the Ozaki scheme (FP64 emulation with tensor cores) might do the trick. It's been added (just a little bit) to cuBLAS recently.
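For a flavor of how that works: the Ozaki scheme splits each fp64 operand into mantissa slices so the slice products are exact in lower precision, runs the partial matmuls there, and sums the results. Here is a toy of just the split-and-sum idea in NumPy - it stays in fp64 instead of using tensor cores, so it is purely illustrative, not the cuBLAS implementation:

    import numpy as np

    def split_hi_lo(a, bits=26):
        # Veltkamp/Dekker-style split: hi keeps the top mantissa bits, lo the
        # rest, so each slice carries few enough bits for exact products.
        scale = np.float64(2.0 ** bits + 1.0)
        t = scale * a
        hi = t - (t - a)
        return hi, a - hi

    def matmul_sliced(A, B):
        A_hi, A_lo = split_hi_lo(A)
        B_hi, B_lo = split_hi_lo(B)
        # On a GPU these partial GEMMs would run on low-precision units; the
        # real scheme uses more slices plus careful error-free accumulation.
        return (A_hi @ B_hi) + (A_hi @ B_lo) + (A_lo @ B_hi) + (A_lo @ B_lo)

    rng = np.random.default_rng(0)
    A, B = rng.standard_normal((64, 64)), rng.standard_normal((64, 64))
    print(np.max(np.abs(matmul_sliced(A, B) - A @ B)))  # tiny, ~1e-14 or less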
My 64gb DDR5 kit started having stability issues running XMP a few weeks out of warranty. I bought it two years ago. Looked into replacing it and the same kit is now double the price. Bumping the voltage a bit and having better cooling gets it through memtest thankfully. The fun of building your own computer is pretty much gone for me these days.
Last night, while writing a LaTeX article, with Ollama running for other purposes, Firefox with its hundreds of tabs, multiple PDF files open, my laptop's memory usage spiked up to 80GB RAM usage... And I was happy to have 128GB. The spike was probably due to some process stuck in an effing loop, but the process consuming more and more RAM didn't have any impact on the system's responsiveness, and I could calmly quit VSCode and restart it with all the serenity I could have in the middle of the night.
Is there even a case where more RAM is not really better, except for its cost?
> Is there even a case where more RAM is not really better, except for its cost?
It depends. It takes more energy, which can be undesirable in battery powered devices like laptops and phones. Higher end memory can also generate more heat, which can be an issue.
But otherwise more RAM is usually better. Many OSes will dynamically use otherwise unused RAM to cache filesystem reads, making subsequent reads faster, and many databases will prefetch into memory if it is available, too.
That said, I wholeheartedly agree that "more RAM, less problems". The only case I can think of where it's not strictly better to have more is hibernation (cf. sleep), when the system has to write 128GB of RAM to disk.
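On the page-cache point above: you can watch it happen with two timed reads of the same file (the path is just an example - substitute any large local file):

    import time

    def timed_read(path):
        t0 = time.perf_counter()
        with open(path, "rb") as f:
            while f.read(1 << 20):  # stream the file in 1 MiB chunks
                pass
        return time.perf_counter() - t0

    path = "/var/log/syslog"  # example path; any big file works
    print("cold read:", timed_read(path))  # served (at least partly) from disk
    print("warm read:", timed_read(path))  # usually much faster: page cache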
On consumer platforms, the more memory modules you have, the slower they all run. I.e. a single module of DDR5 might run at 5600 MT/s, but if you have four of them they all get throttled to 3800 MT/s.
Intel's consumer processors (and therefore the mainboards/chipsets) used to have four memory channels, but around 2020 this was suddenly limited to two channels, as of the 12th generation (AMD's consumer processors have always had two channels, with the exception of Threadripper?).
However, this does not make sense: for more than a decade processors have grown mostly by increasing the number of threads, so two channels sounds like a negligent, deliberately imposed bottleneck on memory access if one uses all those threads (say, for 3D rendering, video post-production, games, and so on).
And if one wants four channels to get past that imposed bottleneck, the mainboards that have four channels nowadays aren't aimed at consumer use, so they come with one or two USB connectors, three or four LAN connectors, and prohibitive prices.
We are talking about consumer quad-channel DDR4 machines ten years old, widely spread, that remain competitive with current consumer ones, if not better. It is as if everything had been frozen all these years (and what remains to be seen, given such a pattern).
Now it is rumoured that AMD may opt for four channels for its consumer lines thanks to the increased number of pin connectors (good news if true).
It is a bad joke what the industry is doing to customers.
> Intel's consumer processors (and therefore the mainboards/chipsets) used to have four memory channels, but around 2020 this was suddenly limited to two channels, as of the 12th generation (AMD's consumer processors have always had two channels, with the exception of Threadripper?).
You need to re-check your sources. When AMD started doing integrated memory controllers in 2003, they had Socket 754 (single-channel / 64-bit wide) for low-end consumer CPUs and Socket 940 (dual-channel / 128-bit wide) for server and enthusiast desktop CPUs, but less than a year later they introduced Socket 939 (128-bit), and since then their mainstream desktop CPU sockets have all had a 128-bit-wide memory interface. When Intel later also moved their memory controller from the motherboard to the CPU, they likewise used a 128-bit-wide memory bus (starting with LGA 1156 in 2009).
There's never been a desktop CPU socket with a memory bus wider than 128 bits that wasn't a high-end/workstation/server counterpart to a mainstream consumer platform that used only a 128-bit wide memory bus. As far as I can tell, the CPU sockets supporting integrated graphics have all used a 128-bit wide memory bus. Pretty much all of the growth of desktop CPU core counts from dual core up to today's 16+ core parts has been working with the same bus width, and increased DRAM bandwidth to feed those extra cores has been entirely from running at higher speeds over the same number of wires.
What has regressed is that the enthusiast-oriented high-end desktop CPUs derived from server/workstation parts are much more expensive and less frequently updated than they used to be. Intel hasn't done a consumer-branded variant of their workstation CPUs in several generations; they've only been selling those parts under the Xeon branding. AMD's Threadripper line got split into Threadripper and Threadripper PRO, but the non-PRO parts have a higher starting price than early Threadripper generations, and the Zen 3 generation didn't get non-PRO Threadrippers.
At some point the best "enthusiast-oriented HEDT" CPUs will be older-gen Xeon and EPYC parts, competing fairly in price, performance and overall feature set with top-of-the-line consumer setups.
Mainboards have two memory channels, so you should be able to reach 5600 MT/s on both, and dual-slot mainboards have better routing than quad-slot ones. This means the practical limit for consumer RAM is 2x48GB modules.
If you are working on an application that has several services (database, local stack, etc.) as docker containers, those can take up more memory. Especially if you have large databases or many JVM services, and are running other things like an IDE with debugging, profiling, and other things.
Likewise, if you are using many local AI models at the same time, or some larger models, then that can eat into the memory.
I've not done any 3D work or video editing, but those are likely to use a lot of memory.
Having recently upgraded from 96GB to 192GB I'm pretty happy. I run many containers, have 20 windows of VSCode open, and so on. Plus AI inference on the CPU when 48GB of VRAM is not enough.
Interesting that Samsung put their prices up 60% today, and a retailer who bought their stock at the old price feels compelled to put their prices up 2.5x.
When the AI bubble bursts we can get back to the old price
The cost of inventory on the shelves basically doesn’t matter. The only thing that matters is the market rate.
If those retailers didn’t increase their prices when the price hike was announced, anyone building servers would have instantly purchased all of the inventory anyway at the lower prices, so there wouldn’t actually have been weeks of low retail RAM prices for everyone.
Every once in a while you can catch a retailer whose pricing person missed the memo and forgot to update the retail price when the announcement came out. They go out of stock very rapidly.
> If those retailers didn’t increase their prices when the price hike was announced, anyone building servers would have instantly purchased all of the inventory anyway at the lower prices
But that retailer would have made a lot of money in a very short time.
In the scenario where they don't raise prices, they sell out immediately. In the scenario where they do raise prices, it's too expensive so you don't buy it. In the scenario where they keep prices low, and do a lottery to see who can buy them, you don't get picked.
No matter what, you are not getting those modules at the old price. There are few things that trip up people harder than this exact scenario, and it happens everywhere. Concert tickets, limited releases, water during crises, hot Christmas gift, pandemic GPUs, etc.
Once understood you can stop getting mad over it like it's some conspiracy. It's fundamental and natural market behavior.
I guess I lucked out. I bought a 768GB workstation (with a 9995WX CPU and an RTX 6000 Pro Blackwell GPU) in August. 96GB modules were better value than 128GB. That build would be a good bit pricier today, it looks like.
Yeah, you are not alone in being annoyed here. I think we need to penalise all who drive the prices up - that includes the manufacturers but also the AI companies etc...
Those price increases are not normal at all. I understand that most of it still comes from market demand, but this is also skewing the market in unfair ways. Such increases smell of criminal activity too.
> I think we need to penalise all who drive the prices up - that includes the manufacturers but also the AI companies etc...
You want to penalize companies for buying things and penalize companies for selling things at market rate?
There are a lot of good examples through history about how central planning economics and strict price controls do not lead to good outcomes. The end result wouldn’t be plentiful cheap RAM for you. The end result would be no RAM for you at all because the manufacturers choose to sell to other countries who understand basic economics.
I think there's a case for banning the sale of services well below the marginal cost of supplying that service - loss leaders, or "dumping" - when it's done on such a scale as AI marketing.
Such is life. I suggest finding a less volatile hobby, like crocheting.
Actually, the textile market is pretty volatile in the US these days, with Joann out of business. Pick a poison, I guess? There's little room for stability in a privately-owned world.
I think it's somewhat useful long term advice, and I would add that parts prices tend to be asynchronous.
Building a PC in a cost efficient manner generally requires someone to track parts prices over years, buy parts at different times, and buy at least a generation behind.
The same applies to many other markets/commodities/etc...
- the insane frothing hype behind AI is showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns. Even if it squeezes out every single other sector that happens to want to use SDRAM to do things OTHER than buffer memory before it's fed into a PCIE lane for a GPU.
- I'm really REALLY glad i decided to buy brand new gaming laptops for my wife and I just a couple months ago, after not having upgraded our gaming laptops for 7 and 9 years respectively. It seems like gamers are going to have this the worst - GPUs have been f'd for a long time due to crypto and AI, and now even DRAM isn't safe. Plus SSD prices are going up too. And unlike many other DRAM users where it's a business thing and they can to some degree just hike prices to cover - gamers are obviously not running businesses. It's just making the hobby more expensive.
It is a weird form of centralized planning. Except there's no election to get on to the central committee, it's like in the Soviet era where you had to run in the right circles and have sway in them.
There's too much group-think in the executive class. Too much forced adoption of AI, too much bandwagon hopping.
The return-to-office fad is similar, a bunch of executives following the mandates of their board, all because there's a few CEOs who were REALLY worked up about it and there was a decision that workers had it too easy. Watching the executive class sacrifice profits for power is pretty fascinating.
Edit: A good way to decentralize the power and have better decision making would be to have less centralized rewards in the capital markets. Right now we are living through a new gilded age with a few barons running things, because we have made the rewards too extreme and too narrowly distributed. Most market economics assumes that there's somewhat equal decision-making power amongst the econs. We are quickly trending away from that.
The funniest thing is that somehow the executive class is even more out of touch than they used to be.
At least before there was a certain common baseline, derived from everyone watching the same news and reading the same press. Now they are just as enclosed in their thought bubbles as everyone else. It is entirely possible for a tech CEO to have a full company of tech workers despising the current plan and yet be constantly reinforced by LinkedIn and ChatGPT.
The out of touch leader is a trope that I'm willing to bet has existed as long as we've had leaders.
I remember first hearing the phrase "yes man" in relation to a human ass kisser my dad worked with in like 1988.
It's very easy to unknowingly surround yourself with sycophants and hangers-on when you literally have more money than some countries. This is true now and has been true forever. I'm not sure they're more out of touch, as much as we're way more aware?
It's more than the fact they are surrounded by sycophants. It's also that, despite the mythology the executive-worship-industry tries to paint, CxOs and board members of companies are just not very creative or visionary people. They largely spend their time looking at their peers and competitors for hints about what they should be doing. And today, those hints all are "do AI". They're not sitting down and deriving from first principles that AI is the way--they're seeing their buddies steering other companies and they're all saying AI is the way, so they say AI is the way, too.
> They're not sitting down and deriving from first principles that AI is the way--they're seeing their buddies steering other companies and they're all saying AI is the way, so they say AI is the way, too.
I think you're underestimating it a bit. We must implement AI because they were able to sell it so well that they got billion-dollar investors (see all the money coming from Qatar/Saudi Arabia etc.). That's a lot of money coming in that allows them to innovate/etc.
But that thing they all were peddling and getting investors over could be anything! For a while it was "blockchain." Everyone had to do blockchain because everyone was doing blockchain, and investors were giving you money if you say blockchain. I wonder what it will be once the AI bubble bursts.
I swear that every 5-10 years, corporate CEOs all get together in a secret meeting where they all agree on the next buzzword technology. They invite Harvard Business Review and the tech press to give them their marching orders. Then, finally, the white smoke comes forth from the chimney indicating the next bubble buzzword has been chosen, and the industry goes bananas over it for 10 years for no reason.
Sounds quite a bit like stock market. The more sober and cynical of them see fads as fads, irrational but powerful movements, and ride the waves, selling to a greater fool.
Out-of-touch leaders have existed for millennia. The "Emperor's New Clothes" tale was published in 1837 as a retelling of a much older folk tale. Sima Qian criticizes out-of-touch lords and emperors in his book about ancient history, written in the 1st century BC. Maybe there is even older evidence.
No surprise, the CxO class barely lives in the same physical world as us peasants. They all hang out together in their rich-people restaurants and rich-people galas and rich-people country clubs and rich-people vacation spots, socializing with other rich-people and don't really have a lot of contact with normal people, outside of a handful of executive assistants and household servants.
This was Lina Khan's big thing, and I'd argue that our current administration is largely a result of Silicon Valley no longer being able to get exits in the form of mergers and IPOs.
Perhaps a better approach to anti-monopoly and anti-trust is possible, but I'm not sure anybody knows what that is. Khan was very well regarded and I don't know anybody who's better at it.
Another approach would be a wealth and income taxation strategy to ensure sigmoid income for the population. You can always make more, but with diminishing returns to self, and greater returns to the rest of society.
Sorry, how did she stand in the way of IPOs? She was against the larger players providing easy off-ramps to smaller players but I don’t recall anything about IPOs. Indeed, Figma’s IPO is precisely because she undid the pending Adobe / Figma merger if I recall correctly.
A better approach might be farming out shares to stakeholders. That seems a lot more dynamic and self-correcting than periodic taxation battles after the fact.
Khan was largely ineffectual. The current administration, if it can be blamed on SV at all, is more likely to be the result of Harris's insanely ill-timed proposal to tax unrealized capital gains just as election season was kicking into high gear.
IMO Khan was by far the best we've had in at least 2 decades. Her FTC even got a judge to rule to break up Google! The biggest downside Khan had was being attached to a 1-term president. There are just not that many court cases against trillion-dollar companies you can take from investigation to winning on appeal in 4 years.
All true, and I'm not making a value statement about whether her influence was good or bad. However, Khan only threatened the oligarchs' companies, while Harris point-blank threatened their fortunes.
Don't pick a fight with people who buy ink by the barrel and bandwidth by the exabyte-second. Or at least, don't do it a month before an election.
The oligarchs hated Khan with the intensity of a thousand burning suns. If you listened to All In, all they were doing was ranting about her and Gary Gensler.
That being said, Kamala's refusal to run on Khan's record definitely helped cost her the election. She thought she could play footsie with Wall Street and SV by backchanneling that she would fire Khan, so she felt like she couldn't say anything good about Khan without upsetting the oligarchs, but what Khan was doing was really popular.
She was largely ineffectual because she was cock-blocked by the ruling classes. I lean libertarian-capitalist and still I think this. Although it's not a settled debate in the classic liberal or libertarian traditions, there are plenty of arguments in them against the excessive concentration of power.
Samsung lost a large percentage of market share to their competitors in the last couple years, so I'm pretty sure they already have to participate in markets.
I think a better solution is an exponential tax on company size. I.e. once a company starts to earn above, say, 1 billion, its income is taxed at an ever-increasing rate. Or, to put it another way: use taxes to break the power-law, winner-takes-all distribution of company sizes into a Gaussian one.
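A toy version of that schedule (every threshold and rate is a made-up placeholder, just to show the shape):

    import math

    def marginal_rate(income, base=0.21, threshold=1e9, cap=0.95):
        # Marginal rate ramps up exponentially above the threshold, capped <100%.
        if income <= threshold:
            return base
        decades_over = math.log10(income / threshold)  # 10x steps above $1B
        return min(cap, base * 1.5 ** decades_over)

    for income in (5e8, 1e9, 1e10, 1e11, 1e12):
        print(f"${income:>17,.0f}: marginal rate {marginal_rate(income):.0%}")

Past the threshold the rate compounds with every further order of magnitude, which is the "ever-increasing" part.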
> I think a better solution is an exponential tax on company size. I.e. once a company starts to earn above, say, 1 billion, its income is taxed at an ever-increasing rate.
This is in the right spirit but you want two things to be different about it.
The first is that the threshold for a given industry doesn't make sense as a dollar amount, it makes sense as a market share percentage. Having more than 15% market share should be a thing companies don't want, regardless of whether it's a $100 trillion industry or a $100 million one.
And the second is that taxes create a perverse incentive for the government. You absolutely do not want the government to have even more of a financial incentive to sustain and create more of the companies of that size. What you want is to have fewer of them.
So, what you want is a rule that if a company has more than 15% market share, the entire general public is allowed to sue them into bankruptcy for the offense of market consolidation. Which also removes the problem where they buy off the government prosecutors, because if they commit the offense then anybody can sue them.
> And the second is that taxes create a perverse incentive for the government. You absolutely do not want the government to have even more of a financial incentive to sustain and create more of the companies of that size. What you want is to have fewer of them.
That's not really a convincing argument. The government is the body that sets up the economic rules; it is not bound by them. The government doesn't have revenue or profit. Money is created by the government; it doesn't have a value yet. The direct financing of actions through taxes is not done for the government's sake, but is a way for the government to project the costs of governmental action into the economy. Sure, there are a lot of idiots nowadays who think a state should work like a business and make profits, but they are misled.
And who determines what makes for a good market share size to be the threshold?
And by having such a rule, an industry that would be more efficient when consolidated would not be able to consolidate (and you wouldn't know it). It's bad policy imho.
A better way would be for gov't to increase competition by adding supply or demand, whichever one is the bottleneck to competition. If a company such as AWS is getting a lot of market share but its profit margins are still high, then the gov't should incentivize competition by funding or giving loans to businesses that want to compete with AWS.
However, if AWS's profit margins, even at high market share, remain very low (e.g., Amazon's commerce side), then there's no need for the gov't to "step in" at all, as there would be no incentive for any competitor to enter the market due to low margins.
The goal is to not have it happen, because the company is going to see that they're only slightly below the threshold and voluntarily split themselves into smaller pieces and buy themselves a safety margin because if they don't everybody knows the lawsuits are going to vaporize them once they exceed the threshold.
> And who determines what makes for a good market share size to be the threshold?
Anything in the vicinity of 5%-15% would be fine.
> And by having such a rule, an industry that would be more efficient when consolidated would not be able to consolidate (and you wouldn't know it).
This is extremely rare and the circumstances where it happens aren't a mystery. It's when entering the market has extremely high fixed costs but then the unit cost of usage is negligible, e.g. it costs a huge amount of money to install water and sewer but then the incremental cost of someone washing their hands is insignificant.
For those things you either have the government do them, or if it's a private company then it's a regulated utility which is completely banned from anything that even vaguely resembles vertical integration as the price of being allowed to have more than the threshold amount of market share.
> A better way would be for gov't to increase competition by adding supply or demand, whichever one is the bottleneck to competition.
The problem is generally caused by the incumbents capturing the government and then enacting rules that inhibit rather than increase competition. That's why you need anyone to be able to initiate the lawsuit, so they can't capture the government department which is supposed to be thwarting them because then it's the entire public.
So why not solve this issue directly? Transparency, auditing, public awareness etc. are needed to prevent regulatory capture. Public apathy is the reason it is currently "easy" to capture regulators.
The fact is, even if a lawsuit is possible from anyone in the public, no one is going to pay for a lawsuit (which has costs) when the result doesn't net them more profit. So unless the lawsuit enables the accuser to wholesale take a piece of that company as private property from the owners - which no current law would allow, nor are there precedents for it - why would anyone expend private money for a public good?
And in any case, I don't see the apathy going away, even if the lawsuit were free. The same apathy is what allows regulatory capture in the first place. So solving public apathy, first and foremost, is the solution.
> Transparency, auditing, public awareness etc. are needed to prevent regulatory capture. Public apathy is the reason it is currently "easy" to capture regulators.
It's mostly easy because the people doing it are good at lying. When they create a rule it isn't called the "mandate this company's product rule" or the "increase fixed costs to lock out smaller competitors rule", it's sold as a safety measure or consumer protection or some other pretext, even though the effect is to raise costs to the benefit of the companies getting the money or exclude competitors to the benefit of the incumbents.
Or they simply don't prosecute antitrust violations, and then there is nothing to audit because there is nothing happening, meanwhile people are kept distracted with other things.
> The fact is, even if a lawsuit is possible from anyone in the public, no one is going to pay for a lawsuit (which has costs) when the result doesn't net them more profit.
It does net them more profit. The premise is that having more than the threshold amount of market share is a strict liability antitrust violation, which allows any customer or prospective customer (i.e. anyone) to sue them for it. The person who files the lawsuit would get the money, the same as someone who sues a company for pollution or fraud.
The point of letting people sue you for polluting or fraud or, in this case, market consolidation, isn't to make plaintiffs rich, it's to deter the thing you don't want companies to do. The goal isn't to have a lot of lawsuits, the goal is to have companies not want the market to consolidate and actively prevent it because if it happens they'll get sued.
> So solving public apathy, first and foremost, is the solution.
Apathy is cyclical. People don't care until the problem gets bad enough, then they care enough to demand change and make it go away for a while, then they stop caring until it gets bad enough again.
But you don't want people to have to die or get severely abused before the problem gets addressed. What you want is to change the structure of the system to prevent it from getting that bad to begin with, by making sure that the power to nip the problem in the bud (i.e. stop market consolidation at 5% or 15% instead of 50% or 90%) is held by someone who will actually exercise it, which can be accomplished by granting that power to everyone affected, which in this context is each and every member of the public.
This would permanently increase DRAM prices. Memory fabricators either earn billions of dollars in income each year or they can't keep going. There are no little Mom and Pop businesses that can do photolithography on leading process nodes.
Chip fabs used to be like book publishers; you don't have to own a printing press to be an author. Carver Mead even described his vision of the industry that way.
Nowadays you have to get your cell libraries and a large chunk of your toolchain from the fab. Of course it's laundered through Cadence and Synopsys, but it's still coming from the fab. You have to buy your masks from the fab (heck, they aren't even allowed to leave the fab, so do you really own them?). And on and on.
For the record I don't agree with the "exponential" part, but otherwise this is an underappreciated and powerful technique.
In another comment you proposed a sane version of the parent proposal. I wouldn't have commented if fpoling had originally floated that scheme. I was mainly objecting to drastically increasing taxes "once a company starts to earn above, say, 1 billion" without regard for the minimum viable scale of different businesses.
I can still make a book like that in my basement. People do this as a hobby now. You can still build chips like that in your garage. People do this as a hobby now.
These things DO NOT SCALE... you can't have 10,000 people running printing presses in their basements to crank out the NYT every day. A modern chip fab has more in common with the printer for the NYT than it does with what you can crank out in your garage.
Let's look at TSMC's plant in AZ. They went and asked Intel, "hey, where are you sourcing your sulfuric acid from?" When they looked at the American vendors, TSMC asked Intel, "how are you working with this?" Intel's response was that it was the best they could get.
It was not.
TSMC now imports sulfuric acid from Taiwan, because it needs to be outrageously pure. Intel is doing the same.
Every single part, component, step and setup in the chain is like that. There is so much arcane knowledge that the loss of workers represents a serious setback. There are people in the production chain, with PhDs, who are literally training their successors because that's sort of the only option.
Do you know who has been trying the approach you are proposing? China. It has not worked.
Complexity of the fab processes isn't what the parent was talking about. They're talking about the major changes in the relationship between fabless semiconductor companies and commercial foundries.
The complexity of actual fabrication was always, and still is, entirely within the foundry. But in the early days of that model, designs could be more easily handed off at the logical level, leaving the physical design to back end companies, which makes designs much more portable between foundries. (The publisher analogy.) What's changed is that the complexity of physical design has exploded, and you can't make the handoff at nearly as high a level, and there is much more work that depends directly on the specific process you are targeting. Much more work at the physical level falls to the fabless semi companies. So it is much more work to retarget a design to a different foundry or process.
> I can still make a book like that in my basement. People do this as a hobby now. You can still build chips like that in your garage. People do this as a hobby now.
You can absolutely manufacture a convincingly-professional, current-generation book in your basement with a practically-small capital investment.
You cannot manufacture a convincingly-professional chip (being generous: feature size and process technology from the last two decades) in your basement without a 6-7 figure capital expenditure, and even then - good luck.
Is that revenue, or profit? If revenue, it'll slam certain kinds of high-volume low-profit businesses, and if it's profit then the company will just arrange to have big compensation "expenses" for executives.
The latter would have to be backstopped by taxes on individual income.
The sane version of this proposal omits the "exponential" part, applies to profits (net income), and makes the tax rate industry-specific (just like Washington State's revenue tax).
> There's too much group-think in the executive class.
I think this is actually the long tail of "too big to fail." It's not that they're all thinking the same way, it's that they're all no longer hedging their bets.
> we have made the rewards too extreme and too narrowly distributed
We give the military far too much money in the USA.
Diversity is good for populations. If you have a tiny pool of individuals with mostly the same traits (in this case I mean things like culture, education, morality, ethics, rather than class and race - though there are obvious correlations) then you get what some other comments are describing as being effectively centralized planning with extra steps, rather than a market of competing ideas.
> We give the military far too much money in the USA.
~ themafia, 2025
(sorry)
On a more serious note, the military is certainly a money-burning machine, but IMHO it's only government spending, when most of the money in the US is deliberately private.
Could the fintech sector be a bigger example of a money-vacuuming system benefiting statistically nobody?
It's around 3.4% GDP. That puts us in the top 10% or so worldwide, but it's not ridiculously high. It's on a similar level as countries such as Morocco and Colombia, which aren't known for excessive military spending. It's still kind of high for a country with no nearby enemies, but for the most part, US military spending is large because the US economy is large.
It's around 16% of the total federal budget. To be fair, about 1/3 of "military spending" is actually salaries, medical, housing and GI/retirement costs.
It's also the case that none of the CIA, NSA or DHS budgets show up under the military, even though they're performing some of the same functions that would be handled by militaries in other countries.
We also have "black appropriations," so the total spending on surveillance and kinetic operations is often unknowable. Add to this the fact that the Pentagon has never successfully passed an audit, and I think people are right to be suspicious of the topline "fraction of GDP" number.
Military spending is a type of welfare for the wealthy: it is one of the only forms of public or government spending that doesn't crowd out private investors, the way public housing or publicly funded hospitals do. The military contractor class often votes more conservative than is typical for their demographic and economic peers. Spending has been high since WW2, with maybe a slight drop in the late 70s. The current stat of "3.4% of GDP" ignores the fact that a large part of our national debt is from military and war budgets. I saw a statistic in the mid-1990s that if we had kept our military budget at inflation-adjusted levels equal to 1976, our debt would have gone to zero as early as 1994.
Our national debt is from our unwillingness to raise taxes to balance the budget. Federal spending is somewhat high historically, but not absurdly so. Relative to the economy, it's at about the same level as it was in the 1980s. Measured as a percentage of GDP, the current military budget is the lowest since before the Second World War, aside from a brief period at the end of the 1990s where it was slightly lower.
Comparing budgets by adjusting for inflation doesn't make any sense. A budget that served a country of 218 million in 1976 would, when adjusted for inflation, serve a country of 218 million in 2026. Percentage of GDP is what you want to look at.
But federal spending has been historically high ever since like the New Deal.
Budget-to-GDP ratio in the US is close to 40%. (On that note, you should really consider federal + state combined rather than just federal.)
In early 1900s this same ratio was around 5-10%.
It has been increasing pretty much everywhere during the 20th century. It has made me wonder whether much of the prosperity we've seen and felt might not be a result of this ever-increasing percentage. Essentially we're spending more and more and that makes it feel like we're progressing faster than we are. Eventually it's going to have to stop though and I dread what happens when we do.
The New Deal was 90+ years ago. At some point it stops being abnormally high and becomes just how things are done.
I don't see why we'd eventually have to stop this level of spending. The debt is unsustainable, but that's a policy choice to keep taxes too low for the level of spending we've chosen.
Exactly. So instead of electing the people who will allocate the resources, the people who are successful at one thing are given the right to manage resources for whatever they wish, and they can keep being very wrong for a very long time while other people are deprived of those resources due to the mismanagement and can't do anything about it.
In theory I guess this creates a demand that should be satisfied by the market, but in reality it seems like when wealth is too concentrated in the hands of the few who make all the decisions, the market is unable to act.
Centralized planning is needed in any civilization. You need some mechanism to decide where to put resources, whether it's to organize the school's annual excursion or to construct the national highway system.
But yeah, in the end companies behave in trends: if some companies do it, then the other companies have to do it too, even if this makes things less efficient or is even harmful. We can put that down to the human factor, but I think even if we replaced all CEOs with AIs, those AIs would all see the same information and make similar decisions based on it.
There is a Pascal's-wager argument to be had: for each individual company, the punishment for not playing the AI game and missing out on something big is bigger than the punishment for wasting resources by allocating them toward AI efforts plus annoying customers with AI features they don't want or need.
> Right now we are living through a new gilded age with a few barons running things, because we have made the rewards too extreme and too narrowly distributed.
The USA has rid itself of its barons multiple times. There are mechanisms in place, but I am not sure that people are really going to exercise those means any time soon. If this AI stuff is successful in the real world as well, then increasing amounts of power will shift away from the people to the people controlling the AI, with all the consequences that has.
If you get paid for being rich in proportion to how rich you are -- because that's how assets work -- it turns into an exponential, runs away, and concentrates power until something breaks.
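A toy model of that claim (all numbers invented): give larger fortunes a slightly better return rate, as the comment suggests, and the ratio between fortunes runs away rather than staying fixed:

    import math

    def rate(wealth):
        # assumption: +1 point of annual return per order of magnitude of wealth
        return 0.04 + 0.01 * math.log10(wealth / 1e4)

    w_small, w_big = 1e4, 1e7
    for year in range(41):
        if year % 10 == 0:
            print(f"year {year:2d}: big fortune is {w_big / w_small:,.0f}x the small one")
        w_small *= 1 + rate(w_small)
        w_big *= 1 + rate(w_big)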
How is this centralized planning? It's corporate decision-making operating in a free market to optimize for what the majority shareholders want (though the majority of shares are owned by few).
I think the implied thought (?) is there is a similarity between central planning and oligopoly bandwagoning. To my eye, the causes and dynamics are different enough to warrant bucketing them separately.
This is why I think taxes on the very wealthy should be so high that billionaires can't happen. The usual reasons are either about raising revenue or are vague ideas about inequality. It doesn't raise enough revenue to matter, and inequality is a fairly weak justification by itself.
But the power concentration is a strong reason. That level of wealth is incompatible with democracy. Money is power, and when someone accumulates enough of it to be able to personally shake entire industries, it's too much.
You'll just get a different form of power concentration. Do you think the Soviet Union didn't have power concentration in individuals? Of course it did, that's why the general secretary of the party was more important than the actual heads of state and government.
No? I'm saying that power concentration is pretty much unavoidable. The question is more about what they can do with that power. I suspect that people getting more power through wealth in the modern world is better than people concentrating power through politics.
> I'm saying that power concentration is pretty much unavoidable.
It's avoidable by formalizing the execution of power. The head of state is very powerful, but he can't create laws or anything. That all needs to be done by the parliament, which is several hundred people.
I don't think it's unavoidable. I don't see why you couldn't have a relatively weak government that's otherwise pretty laissez-faire besides taxing the hell out of extreme wealth. And a strong government doesn't have to have extremely powerful individuals. Power can be divided, and representatives are ultimately accountable to the people.
What you're saying basically boils down to: kings are inevitable, might as well choose them by economic success instead of the more old-fashioned approaches. I reject the first part.
A centralized authority capable of so severely restricting the economic freedom of the most powerful people implies a far greater concentration of power than the one you're fighting against. You're proposing to cure the common cold with AIDS.
> A centralized authority capable of so severely restricting the economic freedom of the most powerful people implies a far greater concentration of power
Yes. That's the idea. Make the largest concentration of power an elected body, auditable by the commons, whose actions are bound by a set of rules that it can choose but still needs to stick to.
>It is a weird form of centralized planning. Except there's no election to get on to the central committee, it's like in the Soviet era where you had to run in the right circles and have sway in them.
No, it's pure capitalism where Atlas shrugged and ordered billions worth of RAM. You might not like it but don't call it "centralized planning" or "Soviet era".
> Every corporation is a (not so) little pocket of centrally planned economy.
This is confused. Here is how classical economists would frame it: a firm chooses how much to produce based on its cost structure and market prices, expanding production until marginal cost equals marginal revenue. This is price guided production optimization, not central planning.
The dominant criticism of central planning is trying to set production quantities without prices. Firms (generally) don’t do this.
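For reference, the textbook version of that condition (standard micro, not a claim original to this thread): the firm picks quantity q to maximize profit pi(q) = R(q) - C(q); setting dpi/dq = 0 gives R'(q) = C'(q), marginal revenue equals marginal cost, with market prices entering through the revenue term R(q) = p·q.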
> This is confused. Here is how classical economists would frame it: a firm chooses how much to produce based on its cost structure and market prices, expanding production until marginal cost equals marginal revenue. This is price guided production optimization, not central planning.
That's the case in a healthy competitive market. Once you have a monopoly or an oligopoly, you get into central planning territory.
Ok, but recall the context (see above): I’m saying one can understand how firms operate without making a connection to central planning (setting production targets and investment decisions from the top-down without prices).
Economists have concepts and models for monopolies and oligopolies — and the way they operate are quite different from the practice of central planning.
I’m talking from within the frame of economic concepts, and I’m striving to use words as understood in the field. At times I value metaphorical thinking, but here in the case of economics, we don’t need to bend words when other fitting concepts are readily available and battle-tested.
An example: If someone calls corporate consolidation “central planning,” they’ve lost the ability to analyze it properly. The relevant questions for oligopolies (strategic behavior, barriers to entry, tacit collusion) are completely different from central planning questions (calculation problems, information aggregation, incentive alignment).
When technical fields have already solved a conceptual problem through careful definition and model building, importing loose metaphorical language degrades analytical clarity.
If you want to point to or propose a different model than the usual economic dogma, I’m all ears, by the way.
I agree that this discussion isn't based on proper economic terms, but on a layman's understanding.
The claim isn't that it is exactly like a state's central planning, but that it is very similar when viewed across the whole society. You have a powerful caucus, no longer bound by reality (competition), making decisions it thinks are good, which effectively set the pace for the whole economic field. Whether this caucus formed through rigged elections or by inheritance of companies isn't all that relevant. It would be quite a different story if the state enforced its monopoly on (political) power and governing, but it refuses to do so nowadays.
> The relevant questions for oligopolies (strategic behavior, barriers to entry, tacit collusion) are completely different from central planning questions (calculation problems, information aggregation, incentive alignment).
The observation about these oligopolies, now larger than some states, is that they seem to lack strategic reasoning and run more on the vibes of their leaders, who are subject to blindness induced by sycophants, much like in an authoritarian regime. They also tend to treat the whole market as their internal planning problem.
> but here in the case of economics, we don’t need to bend words when other fitting concepts are readily available and battle-tested.
I think a majority of commenters on HN are not as well-versed in economics as you, so they would value elaboration on modern monopolies. I think these differ a bit from classical monopolies in the extent of their ties to the government and their interference in elections. Not that lobbying isn't typical for monopolies, but modern monopolies seem to no longer need to lobby; they simply act and dictate.
A company prices resources within itself completely arbitrarily. How much an hour of employee A's work is worth to the company, and how much using a paperclip costs, has no relation to how much these things actually cost in real money. Once acquired by the company, they are utilized not according to their value but according to central plans. This way a paperclip might become vastly overvalued and scarce while an hour of work is vastly undervalued and wasted.
> Company prices resources within itself completely arbitrarily.
I wouldn’t phrase it this way — to me, this implies unpredictability and/or a lack of rationale. Perhaps you simply mean “internal managers at companies do not necessarily price resources using market mechanisms” which I would agree with.
Many fields of study give insight to the various kinds of distortions that arise from human psychology and negotiation, etc.
To make sure we’re on the same page… In economics, “central planning” refers to a system where a central authority (typically the state) makes comprehensive decisions about production, investment, and resource allocation across an entire economy, replacing market mechanisms. This is associated with command economies like the Soviet Union’s Gosplan system.
And of course I will grant firms use hierarchical coordination mechanisms internally (managers allocate resources by command rather than prices).
I suppose my angle here is to be clear that firms are typically a kind of hybrid entity: they mix various coordination mechanisms (prices and hierarchy). This makes them quite different from centrally planned economies.
We have been living on the investment of previous centuries and decades in the West for close to 40 years now. Everything is broken but that didn't matter because everything that needed a functioning physical economy had moved to the East.
AI is the first industrial breakthrough in a century that needs the sort of infrastructure that previous industrial revolutions needed: namely a ton of raw power.
The bubble is laying bare just how terrible our infrastructure is, and how we've ignored trillions in maintenance to give a few thousand people tax breaks they don't really need.
> the insane frothing hype behind AI is showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns.
This resonates deeply, especially to someone born in the USSR.
This is part of how free markets self-correct: misallocate resources and you run out of resources.
You can blame irrational exuberance, bubbles, or whatnot, but markets are ultimately individual choices times economic power. AI, crypto, housing, dotcom, and so on back through history all had excess because it's not obvious when to join and when to stop.
The problem is that memory manufacturing is hard enough that there are essentially 3 major companies that do it globally: Samsung, SK Hynix, and Micron.
> This is part of how free markets self correct, misallocate resources and you run out of resources.
Except that these corporations will almost certainly get a bail out, under the auspices of national security or some other BS. The current admin is backed by the same VCs that are all in on AI.
They're treating it as a "winner takes all" kind of business. And I'm not sure this is a reasonable bet.
The only way the massive planned investments make sense is if you think the winner can grab a very large piece of a huge pie. I've no idea how large the pie will be in the near future, but I'm even more skeptical that there will be a single winner.
What's odd about this is I believe there does exist a winner takes all technology. And that it's AR.
The more I dream about the possibilities of AR, the more I believe people are going to find it incredibly useful. It's just the hardware isn't nearly ready. Maybe I'm wrong but I believe these companies are making some of the largest strategic blunders possible at this point in time.
There is a reason why there used to be market regulation and the breaking up of monopolies. Nowadays we are experimenting with changes to a stable state that took centuries to reach, because keeping it would be so yesterday, and we will soon find out why that state was chosen in the first place.
It’s maybe new to you (you’re one of today’s lucky 10,000!), but this kind of market failure has been going on since at least the South Sea bubble and tulip mania, if not all the way back to Roman times.
I wonder, is there any way to avoid this kind of market failure? Even a planned economy could succumb to hype - promises that improved societal efficiency are just around the corner.
> Is there any way to avoid this kind of market failure?
There are potentially undesirable tradeoffs and a whole new game of cheats and corruption, but you could frustrate rapid, concentrated growth with things like an increasing tax on raised funds.
Right now, we basically let people and companies concentrate as much capital as they want, as rapidly as they want, with almost no friction, presumably because it helped us economically outcompete the adversary during the Cold War. Broadly, we're now afraid of having any kind of brake or dampener on investments and we are more afraid of inefficiency and corruption if the government were to intervene than we are of speculation or exploitation if it doesn't.
In democratically regulated capitalism, there are levers to pull that could slow down this kind of freight train before it were to get out of control, but the arguments against pulling them remain more thoroughly developed and more closely held than those in favor of them.
There is a way, and if anyone tells you we have to go full Hitler or Stalin to do it, they are lying: the last time we let inequality cook this hard, FDR and the New Deal figured out how to thread the needle and proved it could be done.
Unfortunately, that doesn't seem to be the flavor of politics on tap at the moment.
Sam Altman cornering the DRAM market is a joke, of course, but if the punchline is that they were correct to invest this amount of resources in job destruction, it's going to get very serious very quickly and we have to start making better decisions in a hurry or this will get very, very ugly.
Yeah I know HN is going to hate me for saying that.
If a big company and a few small companies all have identical costs for producing a product, society is better served by having it produced by the few small companies than the one big company.
Once "better served" is quantified, you know the coefficient for taxation.
Make no mistake, this coefficient will be a political football and will be fought over, just like the Fed prime interest rate. But it's a single scalar instead of a whole executive-branch department and a hundred kilopages of regulations like we have in the antitrust-enforcement clusterfuck. Which makes it way harder to pull shenanigans.
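As a toy illustration of the "single scalar" idea (the functional form and every number here are my assumptions, not anyone's actual proposal):

    # Hypothetical: one coefficient k scales a tax penalty with market
    # share, standing in for the quantified "better served" value.
    def concentration_tax_rate(market_share: float, k: float = 0.5) -> float:
        """Extra tax rate for a firm holding `market_share` (0..1)."""
        return min(k * market_share, 0.90)  # capped so it stays a rate

    print(concentration_tax_rate(0.05))  # marginal player: 0.025
    print(concentration_tax_rate(0.60))  # dominant player: 0.30

Arguing over the policy then reduces to arguing over k.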
> If a big company and a few small companies all have identical costs for producing a product, society is better served by having it produced by the few small companies than the one big company.
Why? That's exactly the circumstance where the mere potential for small companies to pop up is enough to police the big company's behavior. You get lower costs (due to economies of scale) and a very low chance of monopolization, so everyone's happy. In the case of this DRAM/flash price spike, the natural "small" actors are fabs slightly off the leading edge, which will be able to retool their production and supply these devices at a higher profit.
>society is better served by having it produced by the few small companies than the one big company.
Well, assuming the scale couldn't be used for the benefit of society rather than to milk it dry. But yes, probably the best approach with a reasonable chance of success, eventually, maybe.
> If a big company and a few small companies all have identical costs for producing a product, society is better served by having it produced by the few small companies than the one big company.
How so? Costs will be higher with multiple small producers, resulting in higher costs for customers. That's the opposite of "society is served better".
We draw the line at monopolies, which makes sense.
By the time a company becomes a monopoly, it is immensely powerful, politically and monetarily, and getting rid of it or splitting it up is near impossible. Monopoly laws are very hard to apply because the corporation has sufficient money and influence to turn politicians into servile puppets.
Best to nip corpos before they gain more revenue than a nation state and become "too big to fail".
For example: allocating the resources to only a few industries deprives everyone else (small players, hobbyists, gamers, tinkerers) of opportunities to play with their toys. And small players playing with random toys are a source of many innovations.
The tone from the AI industry sounds more like a dependent addict by comparison. They're well past the phase where they're enjoying their fix and into the "please, just another terawatt, another container-ship full of Quadros, to make it through the day" mode.
More seriously, I could see some legitimate value in saying "no, you can't buy every transistor on the market."
It forces AI players to think about efficiency and smarter software rather than just throwing money at bigger wads of compute. This might be part of where China's getting their competitive chops from-- having to do more with less due to trade restrictions seems to be producing some surprisingly competitive products.
It also encourages diversification. There is still no non-handwavey road to sustainable long-term profitability for most of the AI sector, which is why we keep hearing answers like "maybe the Extra Fingers Machine cures cancer." Eventually Claude and Copilot have to cover their costs or die. If you're nVidia or TSMC, you might love today's huge margins and willing buyers for 150% of your output, but it's simple due diligence to make sure you have other customers available so you can weather the day the bubble bursts.
It's also a solid PR play. Making sure people can still access the hobbies they enjoy is an easy way to say you're on the side of the mass public. It comes from a similar place to banning ticket scalping or setting reasonable prices on captive concessions. The actual dollars involved are small (how many enthusiast PCs could you outfit with the RAM chips or GPU wafer capacity being diverted to just one AI data centre?) but it makes it look like you're not completely for sale to the highest bidder.
It's not exactly a new type of failure. It's roughly equivalent to Ricardian rent, or pecuniary externalities as the general term. Though I suppose this is a speculative variant, which could be worse somehow.
This happens when you get worse and worse inequality in buying power. The most accurate prediction of how this all plays out, I think, is what Gary Stevenson calls "The Squeeze Out" -> https://www.youtube.com/watch?v=pUKaB4P5Qns
Currently we are still at the stage of extraction from the upper/middle class retail investors and pension funds being sucked up by all the major tech companies that are only focused on their stock price. They have no incentive to compete, because if they do, it will ruin the game for everyone. This gets worse, and the theory (and somewhat historically) says it can lead to war.
Agree with the analysis or not, I personally think it maps quite compellingly onto what is happening with AI. Worth a watch.
Markets are voting machines in the short term and weighing machines in the long term. We’re in the short term popularity phase of AI at the moment. The weighing will come along eventually.
Just like some of the crypto booms and busts if you time it right this could be a good thing. Buy on a refresh cycle when AWS dumps a bunch of chips and RAM used or refurbished (some places even offer warranty which is nice).
And if the market crashes or takes a big dip then temporarily eBay will flood with high end stuff at good prices.
Sucks for anyone who needs to upgrade in the next year or two though !
It's a little ironic to call this a market failure due to resource misallocation when high prices are exactly how misallocation is avoided.
I'm a little suspicious that "misallocation" just means it's too expensive for you. That's a feature, not a bug.
> resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns
That's basically what the rich usually do. They command a disproportionate amount of resources and misallocate them freely on a whim, outside of any democratic scrutiny, squeezing an incredible number of people and small businesses out of something.
Whether that's a strength of the system or a weakness, I'm sure some research will show.
> the insane frothing hype behind AI is showing me a new kind of market failure
I see people using "market failure" in weird ways lately. Just because someone thinks a use for a product isn't important, doesn't mean it's a market failure. It's actually the opposite - consumers are purchasing it at a price they value it.
Someone who doesn't really need 128GB of ram won't pay the higher cost, but someone who does need it will.
> … showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns.
Technically speaking, this is not a market failure. [1] Why? Per the comment above, it is the individuals who are acting irrationally, right? The market is acting correctly according to its design and inputs; its price adjustment is a rational response. The response is not necessarily fair to all people, but traditional styles of neoclassical economic analysis de-emphasize common notions of fairness or equality; the main goal is economic efficiency.
I prefer to ask the question: to what degree is some particular market design serving the best interest of its stakeholders and society? In democracies, we have some degree of choice over what we want!
I say all of this as a person who views markets as mechanisms, not moral foundations. This distinction is made clear when studying political economy (economics for policy analysis), though I think it sometimes gets overlooked in other settings.
If one wants to explore coordination mechanisms that can handle highly irrational demand spikes, you have to think hard. To some degree, one would have to give up a key aspect of most market systems — the notion of one price set by the idea of “willingness to pay”.
[1] Market failure is a technical term within economics meaning the mechanism itself malfunctions relative to its own efficiency criteria.
OpenAI appears to have bought the DRAM not to use it, as they are apparently buying it in unfinished form, but explicitly to take it off the market, cause this massive price increase, and squash competition.
I would call that market manipulation (or failure, if you wish); in a just society Sam Altman would be heading to prison.
> the insane frothing hype behind AI is showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns.
As someone who advocates that we use capitalism only as a tool in specific areas and try to move past it in others, I'll defend it here and say that's not really a market anymore when this happens.
Hyper-concentration of wealth is going to lead to the same issues that command economies have, where low-level capital allocation (buying shit) isn't getting feedback from everyone involved and is just going off one asshole's opinion.
Or not cause inflation, rising cost of living etc. People said the same about crypto GPUs but it never really happened in the end. Those cheap pre-LHR RTX cards never really entered the picture.
Not even. Tulips were non-productive speculative assets. NFTs were what the tulip was. The AI buildout is more like the railroad mania in the sense that there is froth but productive utility is still the output.
The actual underlying models of productive output for these AI tools are a tiny fraction (actually) of the mania, and can be trivially produced in massive quantity without the spend that is currently ongoing.
The big bubble is because (like with tulips back then), there was a belief in a degree of scarcity (due to apparent novelty) that didn’t actually exist.
Just like the beautiful woman buying a luxury bag she doesn't actually need: we can sit here and judge her for it, but at the end of the day it's not our money she's spending at Louis Vuitton, and we're not the one she's going home with.
Anyone who owns shares in US companies (most people here) is both "going home with" the companies involved and buying "the bags".
Not to mention all the people buying the bonds used to fund the whole AI data center buildout, which is probably a ton of pension funds and old folks planning for retirement (also probably more than a few millionaires/billionaires!).
The market failure results from those people having way more money than logic and economic principles dictate they should. A person would normally have to make a lot of good decisions in a row to get that much money, would be expected to continue making good decisions, and wouldn't live long enough to reach these extreme amounts anyway. However, repeated misallocation by the federal government over the last several decades (i.e. excessive money printing) resulted in people getting repeatedly rewarded for making the right kind of bad economic decisions instead.
I don't know if the term console even makes sense any more. It's a computer without a keyboard and mouse. And as soon as you add those, it's a PC. So I don't see how this makes any sense or will ever happen.
At $DAYJOB, we have had confirmed and paid for orders be cancelled within the last week due to price hikes. One DDR5 server configuration went from ~$13k to near $25k USD in a matter of days.
We also were looking for DDR4 memory for some older machines and that has shot up 2x as well.
> Well, patience as a consumer might pay off in the next year or so when the music stops and hyperscalers are forced to dump their inventories.
Their inventories are not what consumers use.
Consumer DDR5 motherboards normally take UDIMMs. Server DDR5 motherboards normally take RDIMMs. They're mechanically incompatible, and the voltages are different. And the memory for GPUs is normally soldered directly to the board (and of the GDDRn family, instead of the DDRn or LPDDRn families used by most CPUs).
As for GPUs, they're also different. Most consumer GPUs are PCIe x16 cards with DP and HDMI ports; most hyperscaler GPUs are going to have more exotic form factors like OAM, and not have any DP or HDMI ports (since they have no need for graphics output).
So no, unfortunately hyperscalers dumping their inventories would be of little use to consumers. We'll have to wait for the factories to switch their production to consumer-targeted products.
Edit: even their NVMe drives are going to have different form factors like E1.S and different connectors like U.2, making them hard for normal consumers to use.
I bet that friendly Chinese entrepreneurs will sell inexpensive E1.S to m.2 adapters, and maybe even PCIe riser cards for putting an OAM and a bunch of fans, and maybe even an HDMI output. Good hardware won't be wasted, given some demand.
> Consumer DDR5 motherboards normally take UDIMMs. Server DDR5 motherboards normally take RDIMMs. They're mechanically incompatible, and the voltages are different.
All you need is a fixed-latency, dumb translator bridge where the adapter forces everything into a simplified JEDEC-compliant mode.
A CA/CK line translator with a fixed retimer, since the biggest mismatch between RDIMM and UDIMM is the command/address path.
RDIMMs route CA/CK through the RCD to the DRAM, while UDIMMs route CA/CK to the DRAM directly. So take the UDIMM CA/CK, delay + buffer + level-shift it, and feed it into an RCD-like input using a delay-locked loop (DLL).
Throw in an SPD translator, PMIC and voltage correction, DQ line conditioning, and some other stuff on a 10–12-layer PCB with retimer chips, a VRM, and level shifters.
It would cost about $40 million to fab and about $100 per adapter, but it would make bank with all the spare UDIMMs when the bubble bursts.
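Back-of-the-envelope on those numbers (the $150 sale price is my assumption; the other figures are from the comment above):

    # Break-even for the hypothetical adapter: $40M one-time fab/NRE
    # cost and $100 per-unit cost are from the parent comment; the
    # $150 sale price is an assumption for illustration only.
    nre = 40_000_000
    unit_cost = 100
    sale_price = 150
    units_to_break_even = nre / (sale_price - unit_cost)
    print(f"{units_to_break_even:,.0f} adapters")  # 800,000 adapters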
I imagine the cost is primarily in the actual DRAM chips on the DIMM. So availability of RDIMMs on the market will affect DRAM prices anyway. These days lots of motherboards come with Oculink, etc. and you can get a U.2 PCIe card for rather cheap.
I put together a small server with mostly commodity parts.
The problem is that it is not entirely clear that the hyperscalers are buying DDR5; instead, it seems supplies are being diverted so that more HBM/GDDR wafers can be produced.
HBM/GDDR is not necessarily as useful to the average person as DDR4/DDR5.
I see it a bit differently. In marketing, companies like AppLovin with the Axon Engine and Zeta Global with Athena are already showing strong profitability, both in earnings and free cash flow. They’re also delivering noticeably higher returns on ad spend compared to pre-AI tools for their customers. This is the area I’m researching most closely, so I can only speak for marketing, but I’d love to hear from others seeing similar results in their industries.
It's a bit of a shame these AI GPUs don't actually have DisplayPort/HDMI output ports, because with all that VRAM they would potentially make nice, cheap, powerful gaming cards.
Will just have to settle for insanely cheap second hand DDR5 and NVMe drives I guess.
AI GPUs suck for gaming. I have seen a video of a guy playing Red Dead Redemption 2 on an H100 at a whopping 8 FPS! And that's after some hacks, because otherwise it wouldn't run at all.
AI GPUs are stripped of most things display-related to make room for more compute cores. So in theory they could "work", but there are bottlenecks making that compute power irrelevant for gaming, even if they had a display output.
If you can't afford the electricity to run the model on free hardware, you'd certainly never be able to afford the subscription to the same product as a service!
But anyway, the trick is to run it in the winter and keep your house warm.
I don't think I am. I don't think economies of scale on hardware will drive costs below free, and while subscription providers might be willing to offer services below the cost of the hardware that runs them, I don't think they'll offer services below the cost of the electricity that runs them.
And while data centers might sign favorable contracts, I don't think they are getting electricity that far below retail.
A single machine for personal inference on models of this size isn't going to idle at a draw so high that electricity becomes a problem. For personal use it wouldn't be under load often, and if you somehow do keep it under heavy load, presumably it's doing something valuable enough to easily justify the electricity.
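Rough numbers for that intuition (every figure below is an assumption for illustration; your wattage and tariff will differ):

    # Assumed: 50 W idle / 500 W under load for a local inference rig,
    # 2 hours of load per day, $0.15/kWh. None of this is from the thread.
    idle_w, load_w = 50, 500
    load_hours = 2
    rate_usd_per_kwh = 0.15
    daily_kwh = (idle_w * (24 - load_hours) + load_w * load_hours) / 1000
    print(f"~${daily_kwh * rate_usd_per_kwh * 30:.0f}/month")  # ~$9/month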
I'm especially annoyed that this is most likely intentional.
(Not at all)OpenAI saw they were falling behind as their competitors got better (GPT-5 and 5.1 were progressively worse for my use case of actual problem solving and tweaking existing scripts; Claude and Sonnet were miles ahead, and I used GPT only due to the lower price). Now open-weights models like Qwen3 and Kimi K2 have exceeded their capability, and you can run them at home if you have the hardware, or for peanuts from a variety of providers. Cheap-er hardware like Strix Halo (and Nvidia DGX) made 128GB of VRAM achievable for enthusiasts. And Google is eating their lunch with Gemini.
All while their CFO starts talking about the government bailing them out of spending they cannot possibly fund.
Of course they will attempt to blow up the entire hardware market, so that if their AI flops they will at least be able to rent out hardware like AWS.
It’s telling that the comment is highly upvoted (as I write this comment, anyway) despite being incoherent and incomplete in multiple places. I guess being generically angry and complaining about popular targets like OpenAI is an easy way to earn upvotes from visitors who don’t actually read, just scan comments for keywords and vibes.
This problem is so much worse when you look at server mobo configurations, which basically jumped from 8 to 12 slots, meaning you need 50% more sticks to saturate versus Epyc 7003/7002. I was hoping to build a Genoa-X server for compute, and the RAM cost just went bonkers (and that's on a nearly 2-year-old platform). I decided to shelve that idea.
Memory prices have always been a racket. In the good old days, news of a memory factory burning down would periodically raise memory prices. Nowadays price-fixing cartels don't need anything so elaborate.
Somebody do the math on when we will reliably start running out of grid power. Only then will this "AI buildout" slow down. Manufacturing generators is boring, and far less invested in than manufacturing AI servers.
That's why it's increasingly important to find answers for how to build models that work sustainably. The approach of training on HUGE amounts of data requiring HUGE infra seems to have so blinded the hype-bros that they are not even planning to innovate toward doing it at small scale.
Wild experience building a PC today and discovering the prices are less competitive with Macs than they've ever been. Building a well-appointed gaming/production/CAD rig is suddenly very expensive, with RAM, GPU, and NVMe prices all so high.
I can't imagine Apple doesn't have capacity booked well in advance, and their suppliers aren't going to stiff them because they'd lose those long-term contracts. Sure, if the shortage lasts a year or more, there'll be issues, but if it's short term they might be fine.
Basically every integrated circuit is exempt from retaliatory tariffs, current custom MacBook Pros are shipping from China direct: which tariffs are you referring to?
So glad I bought 128gb ddr5 for my desktop a year ago... I usually don't need it all but it was cheap at the time. Most I use it for is cpu offloading for LLMs too big for my 3090 and for running 10 or so small VMs for my projects.
We've been getting increasingly fucked for years on housing prices, healthcare, food, live entertainment, etc. Consumer electronics were one of the few areas that you could at least argue you were getting more value per dollar each year. GPU's have been a mess for awhile now but now it seems like it's just going to be everything.
I mentioned this previously in a thread about node sizes and got downvoted for it, but I stand by my opinion: the rest of the world, i.e. normal people, need China to become competitive in chip manufacturing.
Without that competition, everyday consumers are going to get priced out of the market by major corporations. We have reached a point in CPU technology where newer tech is no longer automatically cheaper and faster to make; therefore, we need more competition to keep prices down.
Except a lot of other people don't live in countries that are allied with China's vision. We need more competition among our allies, not have the only competition come from our enemy.
> HBM chips are now emerging as another bottleneck in the development of those models. Both SK Hynix and Micron, an American chipmaker, have already pre-sold most of their HBM production for next year. Both are pouring billions of dollars into expanding capacity, but that will take time. Meanwhile Samsung, which manufactures 35% of the world’s HBM chips, has been plagued by production issues and reportedly plans to cut its output of the chips next year by a tenth.
Companies that invested in CXL got their money's worth. CXL is basically older RAM connected over PCIe. Not only are you not throwing away RAM that cannot be used with the current generation of motherboards and chipsets, you also have a way to get a lot of slower memory for applications that don't need the best and newest.
If we're going to see retailers price-gouging on DDR5, maybe people will be willing to buy slightly older gear with DDR4 (and corresponding motherboard and CPU).
Especially for systems for which the workloads are actually bound by GPU compute, network, or storage.
I've just looked up the current price of 2x8GB DDR4 I bought from CCL back in June for a basic PC for my mother. Was £34, now £75. That's nuts. Interesting to know the old stuff currently has value though.
2 months ago there were a load of second-gen Xeon Scalable servers on offer. Now every one of them has had the RAM stripped out, and it's just the chassis on offer.
Nice that people are recognizing that they don't need DDR5 for most workloads. I have some DDR4 that I'm not currently using, so I just posted it for sale. :)
I'm still on DDR4 but I hope this price gouging will be over by the time I need to finally upgrade :( I have a Ryzen so I did upgrade to the latest AM4 generation.
I just snagged an Asrock Rack mobo (X570), a 5900X, and 128GB ECC DDR4 for $680. Felt like a steal with how memory prices are going these days, ECC to boot.
Even if production capacity wasn't shared/shifting to the higher end products (which it seemingly is), there's certainly going to be an increase in demand for DDR4 as it acts as a substitute good. Prices are already up significantly.
Gamers Nexus is reporting increasing DDR4 prices, but it’s unclear to what extent it’s driven by the DDR5 market. DDR4 production is expected to be slowing anyway given the move to DDR5.
Haven't these memory companies been caught price fixing multiple times over the years? Just how sure are we the AI bubble is the entire reason for these absurd prices?
> Just how sure are we the AI bubble is the entire reason for these absurd prices?
We're not, and the market dictates that they don't even have to talk to know to jack up prices.
This RAM price spike is leading Nvidia's reporting for this quarter: gross margins were 70 percent. It looks like their year-over-year increase in margins (double) is not because they came anywhere close to shipping double the number of units.
Meanwhile, if you look at Micron, their gross margin was 41% for fiscal year 2025, versus 24% for 2024.
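For readers keeping score, the definition in play here: gross margin = (revenue - cost of goods sold) / revenue. At Nvidia's reported 70%, only 30 cents of each revenue dollar is cost of goods; at Micron's 41%, it's 59 cents.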
Micron and its peers are competing with Nvidia for shareholder dollars (the CEO's real customer). Them jacking up prices is because enough of the market is dumb enough to bear it right this second. And every CEO has to be looking at those numbers and thinking the same thing: "Where is my cut of the pie? Why aren't we at 60 percent?"
We're now at a point where hardware costs are going to inhibit development. Everyone short of the biggest players is now locked out, and that's not sustainable. Of the AI ventures there is only one that seems to have a reasonable product, and possibly reasonable financials. Many of the other players are likely going to be able to weather the write-downs.
>Samsung and its peers, SK hynix and Micron Technology, have redirected much of their fabrication capacity to high-end chips used in AI servers. While this shift yields higher margins, it leaves less capacity for traditional DRAM products that power laptops, desktops, and mainstream servers.
So if the AI bubble does pop in early 2026, you will get a tsunami of cheap server RAM. You still won't be able to find cheap PC RAM. So either way, the short term future of computing is firmly fixed in the cloud.
So first it was Bitcoin/crypto, now it's AI. PC gaming is dead at this point. I wonder if it will force developers to care about doing more with less hardware and optimizing now.
Many studios don't even hire rendering engineers anymore. Much of AAA is UE5 slop. And then there is the looming AI slop, which publishers are already thinking about. I think it'll burst at some point, but it'll get worse before it gets better.
I had a simple Proxmox/k8s cluster going, and fitting out the nodes with RAM was the last item on my list. It was cheapo ol' DDR4.
Where I live, the price for my little cluster project has gone up from ~400 USD in July (for a 5-node setup) to almost 2,000 USD right now. I just refreshed the page and it's up by 20% day over day. Welp. I guess the nodes are going to stay with 8GB sticks for a while.
My 2022 GPU has 24GB of RAM. That's like 50% more than what is similarly affordable today. It's fucked up, and I'd rather slow down my spending and watch the whole market go down than get scammed by hype.
And that will result in even more resources being allocated to the "big spenders". We have been in a death spiral for the whole PC field for a long time now. If it wasn't crypto mining (multiple times), then it was HDD mining, then the pandemic, and now it's AI.
What used to be a stable, predictable market has become ultra expensive. And now SSD / DDR pricing is going to hurt even more.
Worst of all, a lot of resources are now going into enterprise hardware. So even if the AI bubble deflates, the market won't be flooded with cheap NVMe drives or cheaper DDR sticks, as that production will have gone into 2.5" U.3 drives, LPDDR memory, and the like.
This website mentions the price increase for DDR, but AI companies use Nvidia GPUs, which probably use HBM or GDDR. So I assume the respective price increase for soldered-on memory on graphics cards is even steeper.
Semiconductor companies have been bitten in the past by scaling up production into a bubble, so of course Samsung just raises prices. When you buy DRAM, remember that you are financing oligarchs and that Stargate has lied yet again.
So AI drives prices up, directly and indirectly. I am not happy, in part because I actually need to purchase new hardware (eventually, and unfortunately quite soon, probably next year already, again).
I think there must be a tax on all those AI corporations; they cost us as a society WAY too much. We need to bring this into the discussion; right now lobbyists such as the orange king want to ban all discussion of it, i.e. making AI investments exempt from numerous things. This is leeching on the general taxpayers, in all countries. It is not acceptable.
>> I am not happy ... I think there must be a tax on all those AI corporations
There are a lot of elements to this AI shit-show that I don't like or worry about, but taxing them specifically because they're driving up the price of memory when you want some is not really a "societal cost". You then mention something about "general taxpayers" - didn't you just lobby to make them a super taxpayer? Go ahead and rant, but this seems like pretty basic supply & demand. Keep some perspective; it's computer memory, not bread.
I pre-ordered and picked up a Framework Desktop with 128GB of DDR5-8000 inside it. This is the type of system that is an indirect byproduct of the shift toward AI - it may not have been what AMD originally intended with the AI Max 395+ line - but it definitely is the kind of optimized thinking that will drive AI into the hands of consumers.
That's part of the reason I think this boom-bust cycle might be a bit different. Hopefully, Intel can use some of the capacity it has coming online in its foundry business to service this need.
I'm so mad about this. I need DDR5 for a new mini-PC I bought, and prices have literally gone up by 2.5x.
128GB used to be $400 in June, and now it's over $1,000 for the same 2x64GB set.
I have no idea if/when prices will come back down but it sucks.
DRAM manufacturers have literally been convicted of price fixing in the past; why do you have to white-knight for them?
Both stories can be true.
https://spectrum.ieee.org/the-great-lightbulb-conspiracy
https://en.wikipedia.org/wiki/Phoebus_cartel
https://m.youtube.com/watch?v=0U5uU6nzgO8
https://www.youtube.com/watch?v=zb7Bs98KmnY
Indeed.
Also, incandescent bulb lifespan is reduced by repeated power cycling. Not only is the legendary firehouse bulb extremely dim, it has been turned off and back on again very few times. Leaving all your lights on all the time would be a waste of power for the average household, more expensive than just replacing the bulbs every so often.
Also lightbulb dimmers were a thing back in the day, so you could always buy more lightbulbs and lower the brightness of each to take advantage of that exponential curve in lifespan.
> The firms can coordinate by agreeing on a strategy they deem necessary for the future of the industry
As long as it doesn't fall into the "collusion" prohibitions of the relevant competition law.
> “People of the same trade seldom meet … but the conversation ends in a conspiracy against the public, or in some contrivance to raise prices.”
Adam Smith, The Wealth of Nations (1776)
Most of us who've been on Earth for a while know that courts often get it wrong. Even if the particular court decision you mention was correct, that does not mean price fixing is the main or underlying reason DRAM prices sometimes go up.
They blatantly were doing it, admitted to it, and did it again later. What kind of crazy is this?
Is this the ‘but he loves me, he wouldn’t hit me again’ of the tech world?
re: other things, I bet I agree.
Historically, yes. But we haven't had demand like this AI buildout before. What happens when OpenAI and NVIDIA monopolize the majority of DRAM output?
I wouldn't be so sure. I've seen analyses making the case that this phase is unlike previous cycles and that DRAM makers will be far less willing to invest significantly in new capacity, especially for consumer DRAM as opposed to enterprise DRAM or HBM (and even there, there's still a significant risk of the AI bubble popping). The shortage could last a decade. Right now DRAM makers are benefiting to an extreme degree, since they can demand basically any price for what they're making now, which reduces the incentive even more.
https://www.tomshardware.com/pc-components/storage/perfect-s...
> The shortage could last a decade.
Do we really think the current level of AI-driven data center demand will continue indefinitely? The world only needs so many pictures of bears wearing suits.
The pop culture perception of AI just being image and text generators is incorrect. AI is many things, they all need tons of RAM. Google is rolling out self-driving taxis in more and more cities for instance.
Congrats on engaging with the facetious part of my comment, but I think the question still stands: do you think the current level of AI-driven data center demand will continue indefinitely?
I feel like the question of how many computers are needed to steer a bunch of self-driving taxis probably has an answer, and I bet it's not anything even remotely close to what would justify a decade's worth of maximum investment in silicon for AI data centers, which is what we were talking about.
Data center AI is also completely uninteresting/non-useful for self-driving taxis, or any other self-driving vehicle.
AI is needed to restart feudalism?
No, even the 10% best-case scenario return on AI won't make it. The bubble is trying to replace all human labor, which is why it is a bubble in the first place. No one is being honest that AGI is not possible with this kind of tech, and scale won't get them there.
The most likely direct response is not new capacity, it's older capacity running at full tilt (given the now higher margins) to produce more mature technology with lower requirements on fabrication (such as DDR3/4, older Flash storage tech, etc.) and soak up demand for these. DDR5/GDDR/HBM/etc. prices will still be quite high, but alternatives will be available.
> produce more mature technology ... DDR3/4
...except current peak in demand is mostly driven by build-out of AI capacity.
Both inference and training workloads are often bottlenecked on RAM speed, and trying to shoehorn older/slower memory tech there would require non-trivial amount of R&D to go into widening memory bus on CPU/GPU/NPUs, which is unlikely to happen - those are in very high demand already.
Even if AI stuff does really need DDR5, there must be lots of other applications that would ideally use DDR5 but can make do with DDR3/4 if there's a big difference in price
I mean, AI is currently hyped, so the most natural and logical assumption is that AI drives these prices up primarily. We need compensation from those AI corporations. They cost us too much.
There's not a difference between "consumer" DRAM and "enterprise" DRAM at the silicon level, they're cut from the same wafers at the end of the day.
Doesn't the same factory produce enterprise (i.e. ECC) and consumer (non-ECC) DRAM?
If there is high demand for the former due to AI, they can increase its production to generate higher profits. This cuts the production capacity for consumer DRAM and leads to higher prices in that segment too. Simple supply & demand at work.
Conceptually, you can think of it as "RAID for memory".
A consumer DDR5 module has two 32-bit-wide buses, which are both for example implemented using 4 chips which each handle 8 bits operating in parallel - just like RAID 0.
An enterprise DDR5 module has a 40-bit-wide bus implemented using 5 chips. The memory controller uses those 8 additional bits to store the parity calculated over the 32 regular bits - so just like RAID 4 (or RAID 5, I haven't dug into the details too deeply). The whole magic happens inside the controller, the DRAM chip itself isn't even aware of it.
Given the way the industry works (some companies do DRAM chip production, it is sold as a commodity, and others buy a bunch of chips to turn them into RAM modules) the factory producing the chips does not even know if the chips they have just produced will be turned into ECC or non-ECC. The prices rise and fall as one because it is functionally a single market.
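A toy sketch of that RAID-4 analogy (deliberately simplified: real ECC memory uses SECDED Hamming codes over the word rather than a plain XOR, but the recovery idea is the same):

    # One extra "chip" stores the XOR of the data chips, so the
    # contents of any single failed chip can be rebuilt, RAID-4 style.
    data_chips = [0x3A, 0x7F, 0x01, 0xC8]   # 4 chips x 8 data bits
    parity_chip = 0
    for byte in data_chips:
        parity_chip ^= byte                 # parity = XOR of all data

    # Say chip 2 dies: XOR of the survivors and the parity rebuilds it.
    rebuilt = parity_chip
    for i, byte in enumerate(data_chips):
        if i != 2:
            rebuilt ^= byte
    assert rebuilt == data_chips[2]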
That makes sense, thank you.
At the silicon level, it is the same.
Each memory DIMM/stick is made up of multiple DRAM chip. ECC DIMMs have an extra chip for storing the error correcting parity data.
The bottleneck is with the chips and not the DIMMs. Chip fabs are expensive and time consuming, while making PCBs and placing components down onto them is much easier to get into.
Got it now, thanks!
Yes, but if new capacity is also directed toward products that can be sold as enterprise memory, we won't see better supply of consumer memory. As long as margins are better and demand is higher for enterprise memory, the average consumer is screwed.
Does it matter that AI hardware has such a short shelf life and fast upgrade cycle? Meaning we may see the RAM chips resold / thrown back onto the used market quicker than before?
Is there still a difference? I have DDR5 registered ECC in my computer.
I mean, the only difference we care about is how much of it is actual RAM vs HBM (to be used on GPUs) and how much it costs. We want it to be cheap. So yes, there's a difference if we're competing with enterprise customers for supply.
I don't really understand why every little thing needs to be spelled out. It doesn't matter. We're not getting the RAM at an affordable price anymore.
Maybe we'll get ECC by default in everything out of this?
Anytime somebody is making a prediction for the tech industry involving a decade timespan I pull out my Fedora of Doubt and tip my cap to m’lady.
A LOT of businesses learned during Covid they can make more money by permanently reducing output and jacking prices. We might be witnessing the end times of economies of scale.
The idea is someone else comes in that's happy to eat their lunch by undercutting them. Unfortunately, we're probably limited to China doing that at this point as a lot of the existing players have literally been fined for price fixing before.
https://en.wikipedia.org/wiki/DRAM_price_fixing_scandal
It seems more likely that someone else comes in and either colludes with the people who are screwing us to get a piece of the action, or gets bought out by one of the big companies that started all this. Since, on the rare occasions companies get caught, they only get weak slaps on the wrist and pay a fraction of what they made in profits (basically just the US demanding its cut), I don't have much faith things will improve any time soon.
Even China has no reason to reduce prices much for memory sold to the US when they know we have no choice but to buy at the prices already set by the cartel. I expect that if China does start making memory they'll sell it cheap within China and export it at much higher prices. Maybe we'll get a black market for cheap DRAM smuggled out of China though.
I think it is partly a system-level response to the widespread just-in-time approach of those businesses' clients. A just-in-time client is very "flexible" on price when supply is squeezed. After this back and forth, I think we'll see a return to some degree of supply buffering (warehousing) to dampen the supply and price shocks in the pipelines.
I thought that, too, but then the Nexperia shitstorm hit, and it was as if the industry had learned nothing at all from the COVID shortages.
Is this still the case in 2025, though?
In a traditional pork cycle there's a relatively large number of players and a relatively low investment cost. The DRAM market in the 1970s and 1980s operated quite similarly: you could build a fab for a few million dollars, and it could be done by a fab which also churned out regular logic - it's how Intel got started! There were dozens of DRAM-producing companies in the US alone.
But these days the market looks completely different. It is roughly equally divided between SK Hynix, Micron, and Samsung. Building a fab costs billions and can easily take 5 years - if not a decade - from start to finish. Responding to current market conditions is basically impossible; you have to plan for the market you expect years from now.
Ignoring the current AI bubble, DRAM demand has become relatively stable - and so has the price. Unless there's a good reason to believe the current buying craze will last over a decade, why would the DRAM manufacturers risk significantly changing their plans and potentially creating an oversupply in the future? It's not like the high prices are hurting them...
Also, current political turbulence makes planning for the long term extremely risky.
Will the company be evicted from the country in 6 months? A year? Will there be 100% tariffs on competitors' imports? Or 0%? Will there be an anti-labor government in effect when the investment matures, or a pro-labor one?
The bigger the investment, the longer the investment timeframe, and the more sane the returns - the harder it is to make the investment happen.
High risk requires a correspondingly high potential return.
That everyone has to pay more for current production is a side effect of the uncertainty, because no one knows the odds of even future production actually happening, let alone of the next fancy whiz-bang technology.
But people do need the current production.
Nothing costs trillions.
If you had a trillion dollars you might find some things are for sale that otherwise wouldn't be...
To be fair, nobody HAS a trillion dollars either. They have stuff that may be worth a trillion dollars when sold.
Have you seen our debt recently?..
My guess is that they will plummet down when the AI bubble bursts.
A wafer is a wafer. The cost is per square mm. It's pure supply and demand.
No, a wafer is very much not a wafer. DRAM processes are very different from making logic*. You don't just make memory in your fab today and logic tomorrow. But even when you stay in your lane, the industry operates on very long cycles and needs scale to function at any reasonable price at all. You don't just dust off your backyard fab to make the odd bit of memory whenever it is convenient.
Nobody is going to do anything if they can't be sure that they'll be able to run the fab they built for a long time and sell most of what they make. Conversely, fabs don't tend to idle a lot. Sometimes they're only built if their capacity is essentially sold already. Given how massive the AI bubble is looking right now, I personally wouldn't expect anyone to gamble on building a new fab.
* Someone explained this at length on here a while ago, but I can't seem to find their comment. Should've favorited it.
Sure, yes the cost of producing a wafer is fixed. Opex didn’t change that much.
Following your reasoning, which is common in manufacturing, the capex needed is already allocated. So, where does the 2x price hike come from if not supply/demand?
The cost to produce did not go up 100%, or even 20%
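To make the supply/demand point concrete, here's a toy sketch (all curves and numbers invented): with capacity fixed in the short run, the clearing price comes entirely off the demand curve, so price can double while the cost of production doesn't move at all.

    # Toy fixed-capacity market: short-run supply is a vertical line at `capacity`,
    # so the clearing price is set by demand alone.
    capacity = 100.0                       # wafers per period, fixed in the short run

    def clearing_price(demand_scale):
        # Assumed inverse demand curve p(q) = demand_scale / q.
        return demand_scale / capacity

    print(clearing_price(200.0))   # baseline demand -> price 2.0
    print(clearing_price(400.0))   # demand doubles  -> price 4.0, same cost base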
Actually, DRAM fabs do get scaled down, very similar to the Middle East scaling down oil production.
DRAM/flash fab investment probably did get scaled down due to the formerly low prices, but once you do have a fab it makes sense to have it produce flat out. Then that chunk of potential production gets allocated into DRAM vs. HBM, various sorts of flash storage etc. But there's just no way around the fact that capacity is always going to be bottlenecked somehow, and a lot less likely to expand when margins are expected to be lower.
> Sometimes they're only built if their capacity is essentially sold already.
"Hyperscalers" already have multi-year contracts going. If the demand really was there, they could make it happen. Now it seems more like they're taking capacity from what would've been sold on the spot or quarterly markets. They already made their money.
I just looked at the invoice for my current PC parts that I bought in April 2016: I paid 177 EUR (~203 USD) for 32GB (DDR4-2800).
It's kinda sad when you grow up in a period of rapid hardware development and now see 10 years going by with RAM $/GB prices staying roughly the same.
Well, I've experienced both to some degree in the past. The previous long stretch of very similar hardware performance was when PCs were exorbitantly expensive and the Commodore 64 was the main "home computer" (at least in my country), through the late 80s and early 90s.
That period of time had some benefits. Programmers learned to squeeze absolutely everything out of that hardware.
Perhaps writing software for today's hardware is again becoming the norm rather than being horribly inefficient and simply waiting for CPU/GPU power to double in 18 months.
I was lucky. I built my AM5 7950X Ryzen PC with a 2x48GB DDR5 kit two years ago. A month ago I bought a 4x48GB kit, with the idea of building another home server around the old 2x48GB kit.
Today my old G.Skill 2x48GB kit costs double what I paid for the 4x48GB.
Furthermore, I bought two used RTX 3090s (for AI) back then. A week ago I bought a third one for the same price (for the VRAM in my server).
> It's kinda sad when you grow up in a period of rapid hardware development and now see 10 years going by with RAM $/GB prices staying roughly the same.
But you’re cherry picking prices from a notable period of high prices (right now).
If you had run this comparison a few months ago or if you looked at averages, the same RAM would be much cheaper now.
We’re just consuming a lot of DRAM in general.
Olds remember the years around '95 when RAM stayed the exact same price per megabyte for what seemed like a decade.
I paid about GBP 20K for the 192MB RAM in a Sun SPARC 5 workstation in 1995. That’s maybe $27K USD in 1995 dollars. Gulp.
There is or was a website that would let you plug in an Apple computer, and then tell you what you'd be worth if instead you'd bought Apple stock.
I put my G4 PowerBook into it once, and then vowed never to look at it again.
I bought a bunch of hard drives in 2021 (16TB Seagate Exos) that are now $50-$100 more expensive. It's depressing.
If the sticker price stayed the same since 2016, it got about 26% cheaper in real terms (general prices are up roughly a third since then).
Aside, $203 USD back then would be about $276 USD after inflation. Not a primary effect, but contributory.
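For the curious, the arithmetic is just a price-index ratio; a quick sketch in Python (the ~1.36 factor is implied by the $203/$276 figures above, not an official CPI value):

    # Inflation adjustment as a simple ratio.
    cpi_factor = 276 / 203           # ~1.36, implied cumulative inflation 2016 -> now
    paid_2016 = 203                  # USD
    print(paid_2016 * cpi_factor)    # ~276 USD in today's money
    print(1 - 1 / cpi_factor)        # ~0.26: a flat sticker price is ~26% cheaper in real terms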
I think that goes to show that official inflation benchmarks are not very practical or useful as buckets of things that people actually buy or desire. If the bucket that measured inflation included computer parts (GPUs?), food, and housing - i.e. all the things that a geek really needs - inflation would be way higher...
> If the bucket that measured inflation included computer parts (GPUs?), food, and housing - i.e. all the things that a geek really needs - inflation would be way higher...
A house is $500,000
A GPU is $500
You could put GPUs into the inflation bucket and it wouldn't change anything. Inflation trackers count cost of living and things you pay monthly, not one-time luxury expenses every 4 years that geeks buy for entertainment.
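To put a number on the weighting point, here's a toy CPI-style basket (all shares and price changes invented): because the index is a spending-share-weighted sum, the GPU line barely registers.

    # Toy CPI-style index: inflation = sum of (spending share * price change).
    basket = {
        "housing": (0.35, 0.05),    # (assumed share, assumed yearly price change)
        "food":    (0.15, 0.04),
        "gpu":     (0.002, 0.30),   # a $500 GPU every few years is a tiny share
        "other":   (0.498, 0.02),
    }
    inflation = sum(share * change for share, change in basket.values())
    print(f"{inflation:.2%}")       # ~3.4%; the GPU line adds only ~0.06 points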
Also, we're likely comparing RAM at different speeds and memory bandwidths.
Also need to account for the dollar's decline vs other currencies (which, yes, is possibly somewhat factored into dollar inflation, so you'd have to do the inflation calculation in euros and then convert to dollars, accounting for the decline in value).
I just gave up and built an AM4 system with a 3090 because I had 128GB of DDR4 UDIMMs on hand; the whole build cost less than the memory alone would have for an AM5/DDR5 build.
Really wish I could replace my old Skylake-X system, but even DDR4 RDIMMs for an older Xeon are crazy now, let alone DDR5. Unfortunately I need slots for 3x Titan Vs for the 7.450 TFLOPS each of FP64. Even the 5090 only does 1.637 TFLOPS for FP64, so I'm just hoping that old system keeps running.
If you don't need full IEEE-754 double precision, the Ozaki scheme (emulation with tensor cores) might do the trick. It's been added (just a little bit) to cuBLAS recently.
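For anyone curious what that kind of emulation looks like, here's a toy two-term splitting in NumPy. It's only the flavor of the idea - not the actual Ozaki scheme (which slices by exponent so the low-precision products are exact) and not the cuBLAS implementation. Products of float32 pieces with a float64 accumulator stand in for the tensor cores' low-precision multiply / wide accumulate:

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((256, 256))
    B = rng.standard_normal((256, 256))

    # Split each float64 matrix into a float32 "head" plus a float32 remainder.
    A_hi = A.astype(np.float32); A_lo = (A - A_hi).astype(np.float32)
    B_hi = B.astype(np.float32); B_lo = (B - B_hi).astype(np.float32)

    # 24-bit x 24-bit products fit exactly in float64, so with a wide
    # accumulator only the tiny dropped A_lo @ B_lo term is lost.
    f64 = np.float64
    approx = (A_hi.astype(f64) @ B_hi.astype(f64)
              + A_hi.astype(f64) @ B_lo.astype(f64)
              + A_lo.astype(f64) @ B_hi.astype(f64))

    exact = A @ B
    naive = (A.astype(np.float32) @ B.astype(np.float32)).astype(f64)
    print(np.abs(naive - exact).max())    # plain float32 matmul error
    print(np.abs(approx - exact).max())   # split version: orders of magnitude closer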
Ordered some servers 6 months ago at ~12k USD per unit.
Same order, same bill of materials: 17.5k USD per unit today.
That is roughly a 5.5k increase, driven by the 768GB of DDR5 ECC memory and the four 2TB NVMe SSDs.
My 64gb DDR5 kit started having stability issues running XMP a few weeks out of warranty. I bought it two years ago. Looked into replacing it and the same kit is now double the price. Bumping the voltage a bit and having better cooling gets it through memtest thankfully. The fun of building your own computer is pretty much gone for me these days.
Doubled in the last 4 months https://www.youtube.com/watch?v=o5Zc-FsUDCM
Upgraded by adding 64GB.. last Friday I sold the 32 GB I took out for what I paid for the 64 GB in July... insane
Time to start scouring used-PC sales to reclaim the RAM and sell it for a profit?
Have you not noticed the domain of the submitted article? Others are way, way ahead on that already.
(Including the submitter. In their comment history is "Tip: You can sell used server RAM or desktop modules through BuySellRam to recover value from old hardware." at https://news.ycombinator.com/item?id=45800881 and all of the submissions of this domain are from this user: https://news.ycombinator.com/from?site=buysellram.com )
If you can find used PCs being liquidated with DDR4 RAM that is fast enough for a modern build, then you might.
Old RAM that comes out of the PCs being sold at fire sale prices isn’t really in demand though. Even slower DDR4 grades aren’t seeing much demand.
but why wouldn't that used-PC simply increase in price due to the components becoming more expensive?
Information asymmetry
Why do we all need 128GB now? I was happy with 32.
Close a few Chrome tabs, and save some DDR5 for the rest of us. :-)
Last night, while writing a LaTeX article, with Ollama running for other purposes, Firefox with its hundreds of tabs, and multiple PDF files open, my laptop's memory usage spiked to 80GB... And I was happy to have 128GB. The spike was probably due to some process stuck in an effing loop, but the process consuming more and more RAM didn't have any impact on the system's responsiveness, and I could calmly quit VSCode and restart it with all the serenity one can have in the middle of the night. Is there even a case where more RAM is not really better, except for its cost?
> Is there even a case where more RAM is not really better, except for its cost?
It depends. It takes more energy, which can be undesirable in battery powered devices like laptops and phones. Higher end memory can also generate more heat, which can be an issue.
But otherwise more RAM is usually better. Many OSes will dynamically use otherwise-unused RAM to cache filesystem reads, making subsequent reads faster, and many databases will prefetch into memory if it is available, too.
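A quick way to see the page-cache effect yourself (the path is a placeholder; any large file on Linux/macOS works):

    import time

    def timed_read(path):
        """Stream a file in 1 MiB chunks; return elapsed seconds."""
        start = time.perf_counter()
        with open(path, "rb") as f:
            while f.read(1 << 20):
                pass
        return time.perf_counter() - start

    path = "/path/to/a/large/file"          # placeholder
    print("cold:", timed_read(path))        # likely hits the disk
    print("warm:", timed_read(path))        # usually served from RAM (page cache)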
Firefox is particularly good at having lots of tabs open and not using tons of memory.
Activity Monitor claims Firefox is using 3.1GB of RAM. That said, I wholeheartedly agree that "more RAM, less problems". The only case I can think of where more is not strictly better is during hibernation (cf. sleep), when the system has to write 128GB of RAM to disk. In my experience Firefox is "pretty good" about having lots of tabs and windows open, if you don't mind it crashing every week or two.
I've not had a crash in Firefox in like a decade, basically since the Quantum update in 2017.
Try living like I do. I currently have 1,838 tabs open across 9 different windows. On second thought, maybe don't live like I do...
On consumer platforms, the more memory modules you have, the slower they all run. For example, a single DDR5 module might run at 5600 MT/s, but if you have four of them they all get throttled to around 3800 MT/s.
Intel's consumer processors (and therefore the mainboards/chipsets) used to have four memory channels, but around 2020, from the 12th generation on, this was suddenly limited to two channels (AMD's consumer processors have always had two channels, with the exception of Threadripper?).
However, this does not make sense: for more than a decade, processors have grown mainly by increasing the number of threads, so two channels is a negligent, deliberately imposed bottleneck on memory access if one uses all those threads (say for 3D rendering, video post-production, games, and so on).
And if one wants four channels to get past that imposed bottleneck, the mainboards that have four channels nowadays are not aimed at consumer use, so they come with one or two USB connectors, three or four LAN connectors, and prohibitive prices.
We are talking about consumer quad-channel DDR4 machines that are ten years old and widespread, yet still competent compared with current consumer ones, if not better. It is as if everything had been frozen all these years (and who knows how long the pattern will hold).
Now it is rumoured that AMD may opt for four channels for its consumer lines thanks to an increased number of pin connectors (good news if true).
What the industry is doing to customers is a bad joke.
> Intel's consumer processors (and therefore the mainboards/chipsets) used to have four memory channels, but around 2020, from the 12th generation on, this was suddenly limited to two channels (AMD's consumer processors have always had two channels, with the exception of Threadripper?).
You need to re-check your sources. When AMD started doing integrated memory controllers in 2003, they had Socket 754 (single channel / 64-bit wide) for low-end consumer CPUs and Socket 940 (dual channel / 128-bit wide) for server and enthusiast desktop CPUs, but less than a year later they introduced Socket 939 (128-bit), and since then their mainstream desktop CPU sockets have all had a 128-bit wide memory interface. When Intel later also moved their memory controller from the motherboard to the CPU, they also used a 128-bit wide memory bus (starting with LGA 1156 in 2009).
There's never been a desktop CPU socket with a memory bus wider than 128 bits that wasn't a high-end/workstation/server counterpart to a mainstream consumer platform that used only a 128-bit wide memory bus. As far as I can tell, the CPU sockets supporting integrated graphics have all used a 128-bit wide memory bus. Pretty much all of the growth of desktop CPU core counts from dual core up to today's 16+ core parts has been working with the same bus width, and increased DRAM bandwidth to feed those extra cores has been entirely from running at higher speeds over the same number of wires.
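To put numbers on "higher speeds over the same number of wires": peak theoretical bandwidth is just transfers per second times bus width in bytes (real-world throughput is lower). A rough sketch:

    # Peak theoretical DRAM bandwidth for a 128-bit (dual-channel) interface.
    def peak_gb_per_s(mega_transfers, bus_bits=128):
        return mega_transfers * 1e6 * (bus_bits // 8) / 1e9

    print(peak_gb_per_s(3200))   # DDR4-3200: ~51.2 GB/s
    print(peak_gb_per_s(5600))   # DDR5-5600: ~89.6 GB/s over the same 128 wires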
What has regressed is that the enthusiast-oriented high-end desktop CPUs derived from server/workstation parts are much more expensive and less frequently updated than they used to be. Intel hasn't done a consumer-branded variant of their workstation CPUs in several generations; they've only been selling those parts under the Xeon branding. AMD's Threadripper line got split into Threadripper and Threadripper PRO, but the non-PRO parts have a higher starting price than early Threadripper generations, and the Zen 3 generation didn't get non-PRO Threadrippers.
At some point the best "enthusiast-oriented HEDT" CPUs will be older-gen Xeon and EPYC parts, competing fairly in price, performance, and overall feature set with top-of-the-line consumer setups.
Mainboards have two memory channels, so you should be able to reach 5600 MT/s on both, and dual-slot mainboards have better routing than quad-slot ones. This means the practical limit for consumer RAM is 2x48GB modules.
> Is there even a case where more RAM is not really better, except for its cost?
RAM uses power.
It also consumes more physical space. /s
Not really /s, since it is a limited resource in e.g. Laptops.
It depends on what you are doing.
If you are working on an application that has several services (database, local stack, etc.) as docker containers, those can take up more memory. Especially if you have large databases or many JVM services, and are running other things like an IDE with debugging, profiling, and other things.
Likewise, if you are using many local AI models at the same time, or some larger models, then that can eat into the memory.
I've not done any 3D work or video editing, but those are likely to use a lot of memory.
Exactly. I recently doubled my RAM and now have 4GB.
640K ought to be enough for anyone.
Having recently upgraded from 96GB to 192GB, I'm pretty happy. I run many containers, have 20 windows of VSCode open, and so on. Plus AI inference on the CPU when 48GB of VRAM is not enough.
Why did you waste all your money on 32GB when 4GB is enough? Why did we all need 32GB?
Bloated OS loaded with things the buyer does not need and bloated JS ecosystem probably.
Get this. Pen and paper. No need for silicon at all.
You're welcome.
I like to tell people I have 128GB. It's pretty rare to meet someone like me that isn't swapping all the time.
I also tell people that. It’s not true, but it’s free.
Interesting that Samsung put their prices up 60% today, and a retailer who bought their stock at the old price feels compelled to put their prices up 2.5x.
When the AI bubble bursts we can get back to the old price
The cost of inventory on the shelves basically doesn’t matter. The only thing that matters is the market rate.
If those retailers didn’t increase their prices when the price hike was announced, anyone building servers would have instantly purchased all of the inventory anyway at the lower prices, so there wouldn’t actually have been weeks of low retail RAM prices for everyone.
Every once in a while you can catch a retailer whose pricing person missed the memo and forgot to update the retail price when the announcement came out. They go out of stock very rapidly.
> If those retailers didn’t increase their prices when the price hike was announced, anyone building servers would have instantly purchased all of the inventory anyway at the lower prices
But that retailer would have made a lot of money in a very short time.
In the scenario where they don't raise prices, they sell out immediately. In the scenario where they do raise prices, it's too expensive so you don't buy it. In the scenario where they keep prices low, and do a lottery to see who can buy them, you don't get picked.
No matter what, you are not getting those modules at the old price. There are few things that trip up people harder than this exact scenario, and it happens everywhere. Concert tickets, limited releases, water during crises, hot Christmas gift, pandemic GPUs, etc.
Once understood you can stop getting mad over it like it's some conspiracy. It's fundamental and natural market behavior.
I guess I lucked out. I bought a 768GB workstation (with 9995wx CPU and rtx 6000 Pro Blackwell GPU) in August. 96GB modules were better value than 128GB. That build would be a good bit pricier today looks like.
Wow, no kidding. I checked my BOM for the 9950 build I did a year ago, RAM price has doubled for the exact same DDR5-6000 sticks.
Yeah you are not alone here being annoyed. I think we need to penalise all who drive the prices up - that includes the manufacturers but also AI companies etc...
Those price increases are not normal at all. I understand that most of it still comes from market demand, but it is also skewing the market in unfair ways. Such increases smell of criminal activity too.
> I think we need to penalise all who drive the prices up - that includes the manufacturers but also AI companies etc...
You want to penalize companies for buying things, and penalize companies for selling things at market rate?
There are a lot of good examples through history about how central planning economics and strict price controls do not lead to good outcomes. The end result wouldn’t be plentiful cheap RAM for you. The end result would be no RAM for you at all because the manufacturers choose to sell to other countries who understand basic economics.
I think there's a case for banning the sale of services well below the marginal cost of supplying that service - loss leaders, or "dumping" - when it's done on such a scale as AI marketing.
> I have no idea if/when prices will come back down but it sucks.
Years, or when the AI bubble pops, whatever comes first.
Similar situation with QLC flash and HDDs btw.
Such is life. I suggest finding a less volatile hobby, like crocheting.
Actually, the textile market is pretty volatile in the US these days with Joann out of business. Pick your poison, I guess? There's little room for stability in a privately-owned world.
Damn I bought a whole computer with 128GB RAM & 16-core Ryzen CPU for £325 a few months ago.
new?
> I have no idea if/when prices will come back down but it sucks.
Usually after the companies are fined for price-fixing
https://en.wikipedia.org/wiki/DRAM_price_fixing_scandal
I'm still on DDR3 :)
it's cyclical. just wait 10 years
Good advice for the immortal. For everyone else, "do something else instead" is more practical.
I think it's somewhat useful long term advice, and I would add that parts prices tend to be asynchronous.
Building a PC in a cost efficient manner generally requires someone to track parts prices over years, buy parts at different times, and buy at least a generation behind.
The same applies to many other markets/commodities/etc...
Guess I'm sticking with my 5950X and 128GB of DDR4 for a while longer; I should have upgraded my system earlier this year like I was thinking.
All I can say is,
- the insane frothing hype behind AI is showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINKS or HOPES it will result in massive returns. Even if it squeezes out every other sector that happens to want to use SDRAM for things OTHER than buffering memory before it's fed into a PCIe lane for a GPU.
- I'm really REALLY glad I decided to buy brand new gaming laptops for my wife and me just a couple months ago, after not having upgraded our gaming laptops for 7 and 9 years respectively. It seems like gamers are going to have it the worst - GPUs have been f'd for a long time due to crypto and AI, and now even DRAM isn't safe. Plus SSD prices are going up too. And unlike many other DRAM users, where it's a business thing and they can to some degree just hike prices to cover, gamers are obviously not running businesses. It's just making the hobby more expensive.
It is a weird form of centralized planning. Except there's no election to get on to the central committee, it's like in the Soviet era where you had to run in the right circles and have sway in them.
There's too much group-think in the executive class. Too much forced adoption of AI, too much bandwagon hopping.
The return-to-office fad is similar: a bunch of executives following the mandates of their boards, all because there are a few CEOs who were REALLY worked up about it and there was a decision that workers had it too easy. Watching the executive class sacrifice profits for power is pretty fascinating.
Edit: A good way to decentralize the power and get better decision-making would be to have less centralized rewards in the capital markets. Right now we are living through a new gilded age with a few barons running things, because we have made the rewards too extreme and too narrowly distributed. Most market economics assumes that there's somewhat equal decision-making power amongst the econs. We are quickly trending away from that.
The funniest thing is that somehow the executive class is even more out of touch than they used to be.
At least before there was a certain common baseline, derived from everyone watching the same news and reading the same press. Now they are just as enclosed in their thought bubbles as everyone else. It is entirely possible for a tech CEO to have a full company of tech workers despising the current plan and yet be constantly reinforced by LinkedIn and ChatGPT.
The out of touch leader is a trope that I'm willing to bet has existed as long as we've had leaders.
I remember first hearing the phrase "yes man" in relation to a human ass kisser my dad worked with in like 1988.
It's very easy to unknowingly surround yourself with sycophants and hangers-on when you literally have more money than some countries. This is true now and has been true forever. I'm not sure they're more out of touch, as much as we're way more aware of it?
It's more than the fact they are surrounded by sycophants. It's also that, despite the mythology the executive-worship-industry tries to paint, CxOs and board members of companies are just not very creative or visionary people. They largely spend their time looking at their peers and competitors for hints about what they should be doing. And today, those hints all are "do AI". They're not sitting down and deriving from first principles that AI is the way--they're seeing their buddies steering other companies and they're all saying AI is the way, so they say AI is the way, too.
> They're not sitting down and deriving from first principles that AI is the way--they're seeing their buddies steering other companies and they're all saying AI is the way, so they say AI is the way, too.
I think you're underestimating a bit. We must implement AI because they were able to sell it so well that they got billion-dollar investors (see all the money coming from Qatar, Saudi Arabia, etc.). That's a lot of money coming in that allows them to innovate, etc.
But that thing they all were peddling and getting investors over could be anything! For a while it was "blockchain." Everyone had to do blockchain because everyone was doing blockchain, and investors were giving you money if you say blockchain. I wonder what it will be once the AI bubble bursts.
I swear that every 5-10 years, corporate CEOs all get together in a secret meeting where they all agree on the next buzzword technology. They invite Harvard Business Review and the tech press to give them their marching orders. Then, finally, the white smoke comes forth from the chimney indicating the next bubble buzzword has been chosen, and the industry goes bananas over it for 10 years for no reason.
But there wasn't really a bubble crash from blockchain, or did I miss something?
Sounds quite a bit like the stock market. The more sober and cynical of them see fads as fads - irrational but powerful movements - and ride the waves, selling to a greater fool.
Out-of-touch leaders have existed for millennia. The "Emperor's New Clothes" tale was published in 1837 as a retelling of a much older folk tale. Sima Qian criticizes out-of-touch lords and emperors in his book about ancient history, written in the 1st century BC. Maybe there is even older evidence.
No surprise, the CxO class barely lives in the same physical world as us peasants. They all hang out together in their rich-people restaurants and rich-people galas and rich-people country clubs and rich-people vacation spots, socializing with other rich-people and don't really have a lot of contact with normal people, outside of a handful of executive assistants and household servants.
We need better antitrust and anti-monopoly enforcement. Break up the biggest companies, and then they'll have to actually participate in markets.
This was Lina Khan's big thing, and I'd argue that our current administration is largely a result of Silicon Valley no longer being able to get exits in the form of mergers and IPOs.
Perhaps a better approach to anti-monopoly and anti-trust is possible, but I'm not sure anybody knows what that is. Khan was very well regarded and I don't know anybody who's better at it.
Another approach would be a wealth and income taxation strategy to ensure sigmoid income for the population. You can always make more, but with diminishing returns to self, and greater returns to the rest of society.
Sorry, how did she stand in the way of IPOs? She was against the larger players providing easy off-ramps to smaller players but I don’t recall anything about IPOs. Indeed, Figma’s IPO is precisely because she undid the pending Adobe / Figma merger if I recall correctly.
You're right, IPOs were not blocked by this. I wish I could still edit to add a correction!
A better approach might be to farm out shares to stakeholders. That seems a lot more dynamic and self-correcting than periodic taxation battles after the fact.
Khan was largely ineffectual. The current administration, if it can be blamed on SV at all, is more likely to be the result of Harris's insanely ill-timed proposal to tax unrealized capital gains just as election season was kicking into high gear.
IMO Khan was by far the best we've had in at least two decades. Her FTC even got a judge to rule to break up Google! The biggest downside Khan had was being attached to a one-term president. There's just not that many court cases against trillion-dollar companies you can take from investigation to winning the appeal in 4 years.
All true, and I'm not making a value statement about whether her influence was good or bad. However, Khan only threatened the oligarchs' companies, while Harris point-blank threatened their fortunes.
Don't pick a fight with people who buy ink by the barrel and bandwidth by the exabyte-second. Or at least, don't do it a month before an election.
The oligarchs hated Khan with the intensity of a thousand burning suns. If you listened to All In, all they were doing was ranting about her and Gary Gensler.
That being said, Kamala's refusal to run on Khan's record definitely helped cost her the election. She thought she could play footsie with Wall Street and SV by backchanneling that she would fire Khan, so she felt like she couldn't say anything good about Khan without upsetting the oligarchs, but what Khan was doing was really popular.
She was largely ineffectual because she was cock-blocked by the ruling classes. I lean libertarian-capitalist and still I think this. Although it's not a settled debate in the classic liberal or libertarian traditions, there are plenty of arguments in them against the excessive concentration of power.
Samsung lost a large percentage of market share to their competitors in the last couple years, so I'm pretty sure they already have to participate in markets.
Well, assuming they haven't revived the cartel.
Yea when I think of DRAM I think of SK Hynix and Micron with Samsung far behind.
I think a better solution is an exponential tax on company size. I.e., once a company starts to earn above, say, 1 billion, it will be taxed on income by an ever-increasing amount. Or to put it another way: use taxes to break the power-law, winner-takes-all distribution of company sizes into a Gaussian one.
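As a toy illustration of what such a schedule could look like (every threshold and rate here is invented for the sketch, not a worked-out proposal):

    import math

    def tax_rate(income, threshold=1e9, base=0.21, step=0.25, cap=0.95):
        """Flat rate below the threshold; above it, the rate climbs with
        every doubling of income, capped below 100%."""
        if income <= threshold:
            return base
        doublings = math.log2(income / threshold)
        return min(cap, base + step * doublings)

    for income in (5e8, 1e9, 4e9, 3.2e10):
        print(f"{income:14,.0f} -> {tax_rate(income):.0%}")   # 21%, 21%, 71%, 95%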
> I think a better solution is an exponential tax on company size. I.e., once a company starts to earn above, say, 1 billion, it will be taxed on income by an ever-increasing amount.
This is in the right spirit but you want two things to be different about it.
The first is that the threshold for a given industry doesn't make sense as a dollar amount, it makes sense as a market share percentage. Having more than 15% market share should be a thing companies don't want, regardless of whether it's a $100 trillion industry or a $100 million one.
And the second is that taxes create a perverse incentive for the government. You absolutely do not want the government to have even more of a financial incentive to sustain and create more of the companies of that size. What you want is to have fewer of them.
So, what you want is a rule that if a company has more than 15% market share, the entire general public is allowed to sue them into bankruptcy for the offense of market consolidation. Which also removes the problem where they buy off the government prosecutors, because if they commit the offense then anybody can sue them.
> And the second is that taxes create a perverse incentive for the government. You absolutely do not want the government to have even more of a financial incentive to sustain and create more of the companies of that size. What you want is to have fewer of them.
That's not really a convincing argument. The government is the body that sets up the economic rules; it is not bound by them. The government doesn't have revenue or profit. Money is created by the government; it doesn't have a value yet. The direct financing of actions through taxes is not done for the government's sake, but is a way for the government to project the costs of governmental action into the economy. Sure, there are a lot of idiots nowadays who think a state should work like a business and make profits, but they are misled.
> anybody can sue them.
who bears the costs of this suit?
And who determines what makes for a good market share size to be the threshold?
And by having such a rule, an industry that would be more efficient when consolidated would not be able to consolidate (though you wouldn't know it). It's bad policy, imho.
A better way would be for gov't to increase competition by adding supply, or demand, whichever one is the bottleneck to competition. If a company, such as AWS, is getting a lot of marketshare, but their profit margins is still high, then the gov't should incentivize competition by funding or giving loans to businesses that want to compete with AWS.
However, if AWS's profit margins, even at high market share, remains very low (e.g., amazon's commerce side), then there's no need for the gov't to "step in" at all, as there would be no incentive for any competitor to try enter the market due to low margins.
> who bears the costs of this suit?
The goal is to not have it happen, because the company is going to see that they're only slightly below the threshold and voluntarily split themselves into smaller pieces and buy themselves a safety margin because if they don't everybody knows the lawsuits are going to vaporize them once they exceed the threshold.
> And who determines what makes for a good market share size to be the threshold?
Anything in the vicinity of 5%-15% would be fine.
> And by having such a rule, an industry that would be more efficient when consolidated would not be able to consolidate (though you wouldn't know it).
This is extremely rare and the circumstances where it happens aren't a mystery. It's when entering the market has extremely high fixed costs but then the unit cost of usage is negligible, e.g. it costs a huge amount of money to install water and sewer but then the incremental cost of someone washing their hands is insignificant.
For those things you either have the government do them, or if it's a private company then it's a regulated utility which is completely banned from anything that even vaguely resembles vertical integration as the price of being allowed to have more than the threshold amount of market share.
> A better way would be for gov't to increase competition by adding supply, or demand, whichever one is the bottleneck to competition.
The problem is generally caused by the incumbents capturing the government and then enacting rules that inhibit rather than increase competition. That's why you need anyone to be able to initiate the lawsuit, so they can't capture the government department which is supposed to be thwarting them because then it's the entire public.
> so they can't capture the government department
So why not solve this issue directly? Transparency, auditing, and public awareness etc. are needed to prevent regulatory capture. Public apathy is the reason why it is currently "easy" to capture regulators.
The fact is, even if a lawsuit is possible from anyone in the public, no one is going to pay to mount a lawsuit (which has costs) when the result doesn't net them more profit. So unless the lawsuit enables the accuser to wholesale take a piece of that company as private property from the owners - which no law currently allows or has precedent for - why would anyone expend private money for a public good?
And in any case, I don't see the apathy going away, even if the lawsuit were free; the same apathy is what allows regulatory capture in the first place. So solving public apathy first, and foremost, is the solution.
> Transparency, auditing, and public awareness etc. are needed to prevent regulatory capture. Public apathy is the reason why it is currently "easy" to capture regulators.
It's mostly easy because the people doing it are good at lying. When they create a rule it isn't called the "mandate this company's product rule" or the "increase fixed costs to lock out smaller competitors rule", it's sold as a safety measure or consumer protection or some other pretext, even though the effect is to raise costs to the benefit of the companies getting the money or exclude competitors to the benefit of the incumbents.
Or they simply don't prosecute antitrust violations, and then there is nothing to audit because there is nothing happening, meanwhile people are kept distracted with other things.
> The fact is, even if a lawsuit is possible from anyone in the public, no one is going to pay to mount a lawsuit (which has costs) when the result doesn't net them more profit.
It does net them more profit. The premise is that having more than the threshold amount of market share is a strict liability antitrust violation, which allows any customer or prospective customer (i.e. anyone) to sue them for it. The person who files the lawsuit would get the money, the same as someone who sues a company for pollution or fraud.
The point of letting people sue you for polluting or fraud or, in this case, market consolidation, isn't to make plaintiffs rich, it's to deter the thing you don't want companies to do. The goal isn't to have a lot of lawsuits, the goal is to have companies not want the market to consolidate and actively prevent it because if it happens they'll get sued.
> So solving public apathy first, and foremost, is the solution.
Apathy is cyclical. People don't care until the problem gets bad enough, then they care enough to demand change and make it go away for a while, then they stop caring until it gets bad enough again.
But you don't want people to have to die or get severely abused before the problem gets addressed. What you want is to change the structure of the system to prevent it from getting that bad to begin with, by making sure that the power to nip the problem in the bud (i.e. stop market consolidation at 5% or 15% instead of 50% or 90%) is held by someone who will actually exercise it, which can be accomplished by granting that power to everyone affected, which in this context is each and every member of the public.
This would permanently increase DRAM prices. Memory fabricators either earn billions of dollars in income each year or they can't keep going. There are no little Mom and Pop businesses that can do photolithography on leading process nodes.
Nonsense, it would force vertical de-integration.
Chip fabs used to be like book publishers; you don't have to own a printing press to be an author. Carver Mead even described his vision of the industry that way.
Nowadays you have to get your cell libraries and a large chunk of your toolchain from the fab. Of course it's laundered through cadence+synopsys, but it's still coming from the fab. You have to buy your masks from the fab (heck they aren't even allowed to leave the fab so do you really own them?). And on and on.
For the record I don't agree with the "exponential" part, but otherwise this is an underappreciated and powerful technique.
In another comment you proposed a sane version of the parent proposal. I wouldn't have commented if fpoling had originally floated that scheme. I was mainly objecting to drastically increasing taxes "once a company starts to earn above, say, 1 billion" without regard for the minimum viable scale of different businesses.
> Chip fabs used to be like book publishers;
I can still make a book like that in my basement. People do this as a hobby now. You can still build chips like that in your garage. People do this as a hobby now.
These things DO NOT SCALE... you can't have 10,000 people running printing presses in their basements to crank out the NYT every day. A modern chip fab has more in common with the printer for the NYT than with what you can crank out in your garage.
Let's look at TSMC's plant in AZ. They went and asked Intel, "hey, where are you sourcing your sulfuric acid from?" When they looked at the American vendors, TSMC asked Intel, "how are you working with this?" Intel's response was that it was the best they could get.
It was not.
TSMC now imports sulfuric acid from Taiwan, because it needs to be outrageously pure. Intel is doing the same.
Every single part, component, step, and setup in the chain is like that. There is so much arcane knowledge that the loss of workers represents a serious setback. There are people in the production chain, with PhDs, who are literally training their successors because that's sort of the only option.
Do you know who has been trying the approach you are proposing? China. It has not worked.
https://www.youtube.com/asianometry probably the best rough and ready education you can get on the industry.
Complexity of the fab processes isn't what the parent was talking about. They're talking about the major changes in the relationship between fabless semiconductor companies and commercial foundries.
The complexity of actual fabrication was always, and still is, entirely within the foundry. But in the early days of that model, designs could be more easily handed off at the logical level, leaving the physical design to back end companies, which makes designs much more portable between foundries. (The publisher analogy.) What's changed is that the complexity of physical design has exploded, and you can't make the handoff at nearly as high a level, and there is much more work that depends directly on the specific process you are targeting. Much more work at the physical level falls to the fabless semi companies. So it is much more work to retarget a design to a different foundry or process.
> Do you know who has been trying the approach you are proposing? China. It has not worked.
> https://www.youtube.com/asianometry probably the best rough and ready education you can get on the industry.
I would take anything from that channel regarding China with a pinch of salt.
> I can still make a book like that in my basement. People do this as a hobby now. You can still build chips like that in your garage. People do this as a hobby now.
You can absolutely manufacture a convincingly-professional, current-generation book in your basement with a practically-small capital investment.
You cannot manufacture a convincingly-professional chip (being generous: feature size and process technology from the last two decades) in your basement without a 6-7 figure capital expenditure, and even then - good luck.
Is that revenue, or profit? If revenue, it'll slam certain kinds of high-volume low-profit businesses, and if it's profit then the company will just arrange to have big compensation "expenses" for executives.
The latter would have to be backstopped by taxes on individual income.
revenue, obviously, but maybe it would scale with employee numbers... if you have lots of employees, you get taxed less.
So the policy goal is to minimize revenue per employee?
The sane version of this proposal omits the "exponential" part, applies to profits (net income), and makes the tax rate industry-specific (just like Washington State's revenue tax).
Set limits so the top can't earn more than x times the lowest paid in the company, then.
Companies would then outsource their low-paying jobs to other companies.
So make that count then.
They already do
Ah yes, the same tax mentality that is working great for EU innovation.
Corporate taxes specifically were quite high by European standards until 2017, and are not relatively that low today either.
> There's too much group-think in the executive class.
I think this is actually the long tail of "too big to fail." It's not that they're all thinking the same way, it's that they're all no longer hedging their bets.
> we have made the rewards too extreme and too narrowly distributed
We give the military far too much money in the USA.
Diversity is good for populations. If you have a tiny pool of individuals with mostly the same traits (in this case I mean things like culture, education, morality, ethics, rather than class and race - though there are obvious correlations) then you get what some other comments are describing as being effectively centralized planning with extra steps, rather than a market of competing ideas.
> We give the military far too much money in the USA.
~ themafia, 2025
(sorry)
On a more serious note, the military is surely a money-burning machine, but IMHO it's only government spending, when most of the money in the US is deliberately private.
Couldn't the fintech sector be a bigger example of a money-vacuuming system benefiting statistically nobody?
It's around 3.4% of GDP. That puts us in the top 10% or so worldwide, but it's not ridiculously high. It's on a similar level as countries such as Morocco and Colombia, which aren't known for excessive military spending. It's still kind of high for a country with no nearby enemies, but for the most part, US military spending is large because the US economy is large.
It's around 16% of the total federal budget. To be fair, about 1/3 of "military spending" is actually salaries, medical, housing, and GI/retirement costs.
It's also the case that none of the CIA, NSA or DHS budgets show up under the military, even though they're performing some of the same functions that would be handled by militaries in other countries.
We also have "black appropriations," so the total spending on surveillance and kinetic operations is often unknowable. Add to this the fact that the Pentagon has never successfully completed an audit, and I think people are right to be suspicious of the topline "fraction of GDP" number.
Just want to point out that the NSA is part of the DoD. (Or DoW now)
This is true; however, their agency budget is not part of the DoD's budget and is not included in the reported "total" for DoD.
At least not in the data set I use:
https://www.usaspending.gov/explorer/agency
I think the number is probably much higher than we think - there is probably a ton of not so obvious spending on research and development.
Military spending is a type of welfare for the wealthy: it is one of the only forms of public or government spending that doesn't crowd out private investors the way public housing or publicly funded hospitals do. The military and contractor class often vote more conservative than is typical for their demographic and economic peers. Spending has been high since WW2, with maybe a slight drop in the late 70s. The current stat of "3.4% of GDP" ignores the fact that a large part of our national debt is from military and war budgets. I saw a statistic in the mid-1990s that if we had kept our military budget at inflation-adjusted levels equal to 1976's, our debt would have gone to zero as early as 1994.
Our national debt is from our unwillingness to raise taxes to balance the budget. Federal spending is somewhat high historically, but not absurdly so. Relative to the economy, it's at about the same level as it was in the 1980s. Measured as a percentage of GDP, the current military budget is the lowest since before the Second World War, aside from a brief period at the end of the 1990s where it was slightly lower.
Comparing budgets by adjusting for inflation doesn't make any sense. A budget that served a country of 218 million in 1976 would, when adjusted for inflation, serve a country of 218 million in 2026. Percentage of GDP is what you want to look at.
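A toy comparison of the two normalizations (all numbers invented): a budget that merely keeps pace with inflation shrinks relative to a growing economy.

    budget_1976 = 100.0                # arbitrary units
    cpi_factor = 5.4                   # assumed cumulative inflation since 1976
    real_growth = 3.0                  # assumed real GDP multiple since 1976

    gdp_1976 = 1000.0
    gdp_today = gdp_1976 * cpi_factor * real_growth
    budget_today = budget_1976 * cpi_factor       # "same inflation-adjusted budget"

    print(budget_1976 / gdp_1976)      # 10% of GDP then
    print(budget_today / gdp_today)    # ~3.3% of GDP now, for "the same" budget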
But federal spending has been historically high ever since like the New Deal.
Budget-to-GDP ratio in the US is close to 40%. (On that note, you should really consider federal + state combined rather than just federal.)
In early 1900s this same ratio was around 5-10%.
It has been increasing pretty much everywhere during the 20th century. It has made me wonder whether much of the prosperity we've seen and felt might not be a result of this ever-increasing percentage. Essentially we're spending more and more and that makes it feel like we're progressing faster than we are. Eventually it's going to have to stop though and I dread what happens when we do.
We nearly _doubled_ the budget during COVID. The differences are obvious:
https://www.cbo.gov/publication/59946
The New Deal was 90+ years ago. At some point it stops being abnormally high and becomes just how things are done.
I don't see why we'd eventually have to stop this level of spending. The debt is unsustainable, but that's a policy choice to keep taxes too low for the level of spending we've chosen.
Exactly. So instead of electing the people who will allocate the resources, the people who are successful in one thing are given the right to manage resources for whatever they wish, and they can keep being very wrong for a very long time while other people are deprived of those resources by the mismanagement and can't do anything about it.
In theory I guess this creates a demand that should be satisfied by the market, but in reality it seems like when wealth is too concentrated in the hands of the few who make all the decisions, the market is unable to act.
Centralized planning is needed in any civilization. You need some mechanism to decide where to put resources, whether it's to organize the annual school's excursion or to construct the national highway system.
But yeah, in the end companies behave in trends: if some companies do something, then the other companies have to do it too, even if this makes things less efficient or is even hurtful. We can put that down to the human factor, but I think even if we replaced all CEOs with AIs, those AIs would all see the same information and make similar decisions based on it.
There is a Pascal's-wager argument to be had: for each individual company, the punishment for not playing the AI game and missing out on something big is bigger than the punishment for wasting resources by allocating them toward AI efforts plus annoying customers with AI features they don't want or need.
> Right now we are living through a new gilded age with a few barons running things, because we have made the rewards too extreme and too narrowly distributed.
The USA has rid itself of its barons multiple times. There are mechanisms in place, but I am not sure that people are really going to exercise those means any time soon. If this AI stuff is successful in the real world as well, then increasing amounts of power will shift away from the people to those controlling the AI, with all the consequences that has.
If you get paid for being rich in proportion to how rich you are -- because that's how assets work -- it turns into an exponential, runs away, and concentrates power until something breaks.
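In the simplest idealization (a constant rate of return r on wealth W, with no consumption or taxes), that's just

    \frac{dW}{dt} = rW \quad\Longrightarrow\quad W(t) = W_0 e^{rt},

so any persistent difference in r between two holders compounds into an exponentially widening gap.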
How is this centralized planning? It's corporate decision-making operating in a free market to optimize for what majority shareholders want (though the majority of shares are owned by a few).
A free market where the government participates with billions in investment and tax cuts, yes.
It's centralized vs. decentralized not public vs. private. A centralized private planning committee is still centralized.
Your parenthetical is how. It's not completely centralized, but it is being decided by a very small number of people.
I think the implied thought (?) is there is a similarity between central planning and oligopoly bandwagoning. To my eye, the causes and dynamics are different enough to warrant bucketing them separately.
>It is a weird form of centralized planning [...]
It's a form of "centralized planning", except it's not centralized at all.
And there’s no planning
This is why I think taxes on the very wealthy should be so high that billionaires can't happen. The usual reasons are either about raising revenue or are vague ideas about inequality. It doesn't raise enough revenue to matter, and inequality is a fairly weak justification by itself.
But the power concentration is a strong reason. That level of wealth is incompatible with democracy. Money is power, and when someone accumulates enough of it to be able to personally shake entire industries, it's too much.
You'll just get a different form of power concentration. Do you think the Soviet Union didn't have power concentration in individuals? Of course it did, that's why the general secretary of the party was more important than the actual heads of state and government.
Do you think I’m proposing anything like the Soviet system?
No? I'm saying that power concentration is pretty much unavoidable. The question is more about what they can do with that power. I suspect that people getting more power through wealth in the modern world is better than people concentrating power through politics.
> I'm saying that power concentration is pretty much unavoidable.
It's avoidable by formalizing the execution of power. The head of state is very powerful, but he can't create laws or anything. That all needs to be done by the parliament, which is several hundred people.
I don't think it's unavoidable. I don't see why you couldn't have a relatively weak government that's otherwise pretty laissez-faire besides taxing the hell out of extreme wealth. And a strong government doesn't have to have extremely powerful individuals. Power can be divided, and representatives are ultimately accountable to the people.
What you're saying basically boils down to: kings are inevitable, might as well choose them by economic success instead of the more old-fashioned approaches. I reject the first part.
Someone needs to allocate capital, might as well be someone that has done it successfully in the past.
> But the power concentration is a strong reason.
A centralized authority capable of so severely restricting the economic freedom of the most powerful people implies a far greater concentration of power than the one you're fighting against. You're proposing to cure the common cold with AIDS.
> A centralized authority capable of so severely restricting the economic freedom of the most powerful people implies a far greater concentration of power
Yes. That's the idea. Make the largest concentration of power an elected body auditable by the commons and whose actions are formalized by a bunch of rules, that they can choose, but still need to stick to.
Why? We already tax people. This would be a difference of degree, not of kind.
>It is a weird form of centralized planning. Except there's no election to get on to the central committee, it's like in the Soviet era where you had to run in the right circles and have sway in them.
No, it's pure capitalism where Atlas shrugged and ordered billions worth of RAM. You might not like it but don't call it "centralized planning" or "Soviet era".
Every corporation is a (not so) little pocket of centrally planned economy.
The only saving grace is that it can die and others will scoop up released resources.
When country level planned economy dies, people die and resources get destroyed.
> Every corporation is a (not so) little pocket of centrally planned economy.
This is confused. Here is how classical economists would frame it: a firm chooses how much to produce based on its cost structure and market prices, expanding production until marginal cost equals marginal revenue. This is price guided production optimization, not central planning.
The dominant criticism of central planning is trying to set production quantities without prices. Firms (generally) don’t do this.
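A minimal sketch of that textbook condition (curves invented): a price-taking firm expands output until marginal cost meets the market price, which here plays the role of marginal revenue.

    # Expand production while each extra unit still earns more than it costs.
    price = 10.0                               # market price = marginal revenue
    def marginal_cost(q):
        return 2.0 + 0.5 * q                   # assumed rising MC curve

    q = 0.0
    while marginal_cost(q) < price:
        q += 0.01
    print(f"profit-maximizing output ~ {q:.2f}")   # where MC(q) = price, i.e. q = 16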
> This is confused. Here is how classical economists would frame it: a firm chooses how much to produce based on its cost structure and market prices, expanding production until marginal cost equals marginal revenue. This is price guided production optimization, not central planning.
That's the case in a healthy competitive market. Once you have a monopoly or an oligopoly, you get into central planning territory.
Ok, but recall the context (see above): I’m saying one can understand how firms operate without making a connection to central planning (setting production targets and investment decisions from the top-down without prices).
Economists have concepts and models for monopolies and oligopolies — and the way they operate are quite different from the practice of central planning.
I’m talking from within the frame of economic concepts, and I’m striving to use words as understood in the field. At times I value metaphorical thinking, but here in the case of economics, we don’t need to bend words when other fitting concepts are readily available and battle-tested.
An example: If someone calls corporate consolidation “central planning,” they’ve lost the ability to analyze it properly. The relevant questions for oligopolies (strategic behavior, barriers to entry, tacit collusion) are completely different from central planning questions (calculation problems, information aggregation, incentive alignment).
When technical fields have already solved a conceptual problem through careful definition and model building, importing loose metaphorical language degrades analytical clarity.
If you want to point to or propose a different model than the usual economic dogma, I’m all ears, by the way.
I agree that this discussion isn't based on proper economic terms, but on a layman's understanding.
The claim isn't that it is exactly like central planning by a state, but that, viewed across the whole of society, it is very similar. You have a powerful caucus, no longer bound by reality (competition), making decisions they think are good, which effectively set the pace for the whole economic field. Whether this caucus formed through rigged elections or through inheritance of companies isn't all that relevant. It would be quite a different story if the state enforced its monopoly on (political) power and governing, but it refuses to do so nowadays.
> The relevant questions for oligopolies (strategic behavior, barriers to entry, tacit collusion) are completely different from central planning questions (calculation problems, information aggregation, incentive alignment).
The observation about these oligopolies, now larger than some states, is that they seem to lack strategic reasoning and are built more on the vibes of their leader, who is subject to blindness due to sycophants, much like in an authoritarian regime. They also tend to treat the whole market as their internal planning problem.
> but here in the case of economics, we don’t need to bend words when other fitting concepts are readily available and battle-tested.
I think a majority of commenters on HN are not as well-versed in economics as you, so they would value elaboration on modern monopolies. I think these differ a bit from classical monopolies in the extent of their ties to the government and their interference in elections. Not that lobbying isn't typical for monopolies, but modern monopolies seem to no longer need to lobby; they simply act and dictate.
A company prices resources within itself completely arbitrarily. How much an hour of employee A's work is worth to the company, and how much using a paperclip costs, bears no relation to how much these things actually cost in real money. Once they are acquired by the company, they are utilized not according to their value but according to central plans instead. This way a paperclip might become vastly overvalued and scarce, while an hour of work can be vastly undervalued and wasted.
> A company prices resources within itself completely arbitrarily.
I wouldn’t phrase it this way — to me, this implies unpredictability and/or a lack of rationale. Perhaps you simply mean “internal managers at companies do not necessarily price resources using market mechanisms” which I would agree with.
Many fields of study give insight to the various kinds of distortions that arise from human psychology and negotiation, etc.
To make sure we’re on the same page… In economics, “central planning” refers to a system where a central authority (typically the state) makes comprehensive decisions about production, investment, and resource allocation across an entire economy, replacing market mechanisms. This is associated with command economies like the Soviet Union’s Gosplan system.
And of course I will grant firms use hierarchical coordination mechanisms internally (managers allocate resources by command rather than prices).
I suppose my angle here is to be clear that firms are typically a kind of hybrid entity: they mix various coordination mechanisms (prices and hierarchy). This makes them quite different from centrally planned economies.
[dead]
I disagree.
We have been living on the investment of previous centuries and decades in the West for close to 40 years now. Everything is broken but that didn't matter because everything that needed a functioning physical economy had moved to the East.
AI is the first industrial breakthrough in a century that needs the sort of infrastructure that previous industrial revolutions needed: namely a ton of raw power.
The bubble is laying bare just how terrible infrastructure is and how we've ignored trillions of maintenance to give a few thousand people tax breaks they don't really need.
Bridges can be used for decades; your brand new GiGaAiFaRm will be fully depreciated by 2030.
All the infrastructure will be useless when the data centers move to the next city/state offering a tax cut.
A power line can be used for a century.
Sure, if you have something worth powering at the end of it.
The same is true of all infrastructure from roads to water pipes.
You are being obtuse for the sake of AI doomerism.
>AI is the first industrial breakthrough in a century
Is it?
Yeah, I don't know about that.
Why not follow the time-honoured approach and put the data centres in low-income countries?
Because you only do that once the tech has been commoditized and you have wrung all the benefit for your country that you can.
The British didn't industrialise India for a reason.
The British deindustrialized India.
https://en.wikipedia.org/wiki/De-industrialisation_of_India
I assume they don't have good enough power infrastructure.
> the insane frothing hype behind AI is showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns.
This resonates deeply, especially to someone born in the USSR.
This is part of how free markets self-correct: misallocate resources and you run out of resources.
You can blame irrational exuberance, bubbles, or whatnot, but markets are ultimately individual choices times economic power. AI, crypto, housing, dotcom, etc., going back through history, all had excess because it's not obvious when to join and when to stop.
Usually companies run out of resources before they screw up global prices in massive markets.
If it was a couple billion dollars of memory purchasing nobody would care.
> Usually companies run out of resources before they screw up global prices in massive markets.
It happens more often than you might expect.
The Onion Futures Act and what led to it is always a fun read: https://en.wikipedia.org/wiki/Onion_Futures_Act
The problem is that memory manufacturing is hard enough that there are essentially 3 major companies that do it globally: Samsung, SK Hynix, and Micron.
> This is part of how free markets self-correct: misallocate resources and you run out of resources.
Except that these corporations will almost certainly get a bail out, under the auspices of national security or some other BS. The current admin is backed by the same VCs that are all in on AI.
They're treating it as a "winner takes it all"-kind of business. And I'm not sure this is a reasonable bet.
The only way the massive planned investments make sense is if you think the winner can grab a very large piece of a huge pie. I've no idea how large the pie will be in the near future, but I'm even more skeptical that there will be a single winner.
What's odd about this is I believe there does exist a winner takes all technology. And that it's AR.
The more I dream about the possibilities of AR, the more I believe people are going to find it incredibly useful. It's just the hardware isn't nearly ready. Maybe I'm wrong but I believe these companies are making some of the largest strategic blunders possible at this point in time.
Why would AR be particularly likely to have a single winner?
While the technical features are what attract me, I'm convinced what most people are going to be interested in are social features.
There is a reason why there used to be market regulation and the breaking up of monopolies. Nowadays we are experimenting with changes to a stable state that took centuries to reach, because keeping it would be so yesterday, and we will soon find out why that state was chosen in the first place.
It’s maybe new to you (you’re one of today’s lucky 10,000!), but this kind of market failure has been going on since at least the south sea bubble and tulip mania, if not all the way back to Roman times.
I wonder, is there any way to avoid this kind of market failure? Even a planned economy could succumb to hype - promises that improved societal efficiency are just around the corner.
> Is there any way to avoid this kind of market failure?
There are potentially undesirable tradeoffs and a whole new game of cheats and corruption, but you could frustrate rapid, concentrated growth with things like an increasing tax on raised funds.
Right now, we basically let people and companies concentrate as much capital as they want, as rapidly as they want, with almost no friction, presumably because it helped us economically outcompete the adversary during the Cold War. Broadly, we're now afraid of having any kind of brake or dampener on investments and we are more afraid of inefficiency and corruption if the government were to intervene than we are of speculation or exploitation if it doesn't.
In democratically regulated capitalism, there are levers to pull that could slow down this kind of freight train before it were to get out of control, but the arguments against pulling them remain more thoroughly developed and more closely held than those in favor of them.
We don't really seem to be shying away from government corruption.
> there are levers to pull that could slow down this kind of freight train before it were to get out of control
Care to share some keywords here?
Taxes and/or regulatory approval processes.
There is a way, and if anyone tells you we have to go full Hitler or Stalin to do it, they are liars: the last time we let inequality cook this hard, FDR and the New Deal figured out how to thread the needle and proved it could be done.
Unfortunately, that doesn't seem to be the flavor of politics on tap at the moment.
Sam Altman cornering the DRAM market is a joke, of course, but if the punchline is that they were correct to invest this amount of resources in job destruction, it's going to get very serious very quickly and we have to start making better decisions in a hurry or this will get very, very ugly.
A tax on scale.
Yeah I know HN is going to hate me for saying that.
If a big company and a few small companies all have identical costs for producing a product, society is better served by having it produced by the few small companies than the one big company.
Once "better served" is quantified, you know the coefficient for taxation.
Make no mistake, this coefficient will be a political football and will be fought over, just like the Fed prime rate. But it's a single scalar instead of a whole executive-branch department and a hundred kilopages of regulations like we have in the antitrust-enforcement clusterfuck. Which makes it way harder to pull shenanigans.
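For concreteness, here's what "a single scalar" could look like as a toy sketch (the convex schedule and the coefficient are my invention, purely illustrative):

```python
# Hypothetical "tax on scale": one coefficient k maps market share to an
# extra tax rate. Convex, so it's negligible for small firms and bites
# hard near monopoly. The functional form is invented for illustration.

def scale_tax_rate(market_share: float, k: float = 0.5) -> float:
    return k * market_share ** 2

for share in (0.01, 0.10, 0.40, 0.80):
    print(f"share {share:>4.0%} -> extra tax {scale_tax_rate(share):.2%}")
```

The political fight would then be over k (and the exponent), not over a hundred kilopages of rules.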
> If a big company and a few small companies all have identical costs for producing a product, society is better served by having it produced by the few small companies than the one big company.
Why? That's exactly the circumstance where the mere potential for small companies to pop up is enough to police the big company's behavior. You get lower costs (due to economies of scale) and a very low chance of monopolization, so everyone's happy. In the case of this DRAM/flash price spike, the natural "small" actors are fabs slightly off the leading edge, which will be able to retool their production and supply these devices at a higher profit.
> the mere potential for small companies to pop up is enough to police the big company's behavior.
If that were true, "you're in Amazon's kill zone" wouldn't be something VC's say to startups. And yet, they do say that.
And yet, startups exist.
because they stay out of the kill zone
>society is better served by having it produced by the few small companies than the one big company.
Well, assuming the scale couldn't be used for the benefit of society rather than to milk it dry. But yes, probably the best approach that has a reasonable chance of success, eventually, maybe.
> If a big company and a few small companies all have identical costs for producing a product, society is better served by having it produced by the few small companies than the one big company.
How so? Costs will be higher with multiple small producers, resulting in higher prices for customers. That's the opposite of "society is served better".
We draw the line at monopolies, which makes sense.
By the time a company becomes a monopoly, it is immensely powerful - politically and monetarily - getting rid of it or splitting it up is near impossible. Monopoly laws are near impossible to apply as the corporation has sufficient money and influence to turn politicians into servile puppets.
Best to nip corpos before they gain more revenue than a nation state and become "too big to fail".
> > all have identical costs
> Costs will be higher
Read it again, please.
Prices going up 2-3x is not market failure, it's just another commodity cycle. If they went up 10-100x you might have a point.
why do you think allocating hardware to gamers is proper usage?
maybe AI cures cancer, or at least writes some code
For example: allocating the resources to only a few industries deprives everyone else (small players, hobbyists, gamers, tinkerers) of opportunities to play with their toys. And small players playing with random toys are a source of many innovations.
Unless I get all the resources I want, when I want, all at low prices, the market has obviously failed.
Yes, except unironically. A market that cannot efficiently serve the vast majority of the population is a failed market.
Or you could look at reality, where it generates a lot of fake social media posts, and we could all ask: why is this valuable?
Gamers at least enjoy their GPUs and memory.
The tone from the AI industry sounds more like a dependent addict by comparison. They're well past the phase where they're enjoying their fix and into the "please, just another terawatt, another container-ship full of Quadros, to make it through the day" mode.
More seriously, I could see some legitimate value in saying "no, you can't buy every transistor on the market."
It forces AI players to think about efficiency and smarter software rather than just throwing money at bigger wads of compute. This might be part of where China's getting their competitive chops from-- having to do more with less due to trade restrictions seems to be producing some surprisingly competitive products.
It also encourages diversification. There is still no non-handwavey road to sustainable long-term profitability for most of the AI sector, which is why we keep hearing answers like "maybe the Extra Fingers Machine cures cancer." Eventually Claude and Copilot have to cover their costs or die. If you're nVidia or TSMC, you might love today's huge margins and willing buyers for 150% of your output, but it's simple due diligence to make sure you have other customers available so you can weather the day the bubble bursts.
It's also a solid PR play. Making sure people can still access the hobbies they enjoy is an easy way to say you're on the side of the mass public. It comes from a similar place to banning ticket scalping or setting reasonable prices on captive concessions. The actual dollars involved are small (how many enthusiast PCs could you outfit with the RAM chips or GPU wafer capacity being diverted to just one AI data centre?) but it makes it look like you're not completely for sale to the highest bidder.
What do you think happens when the majority of consumers are priced not only out of bread, but also circuses?
History has told us it won't be good for the lords when that happens.
It's not exactly a new type of failure. It's roughly equivalent to Ricardian rent, or pecuniary externalities for the general term. Though I suppose this is a speculative variant, which could be worse somehow.
This happens when you get worse and worse inequality in buying power. The most accurate prediction of how this all plays out, I think, is what Gary Stevenson calls "The Squeeze Out" -> https://www.youtube.com/watch?v=pUKaB4P5Qns
Currently we are still at the stage of extraction from the upper/middle class retail investors and pension funds being sucked up by all the major tech companies that are only focused on their stock price. They have no incentive to compete, because if they do, it will ruin the game for everyone. This gets worse, and the theory (and somewhat historically) says it can lead to war.
Agree with the analysis or not, I personally think it is quite compelling to what is happening with AI, worth a watch.
Markets are voting machines in the short term and weighing machines in the long term. We’re in the short term popularity phase of AI at the moment. The weighing will come along eventually.
Just like some of the crypto booms and busts, if you time it right this could be a good thing. Buy on a refresh cycle when AWS dumps a bunch of chips and RAM, used or refurbished (some places even offer a warranty, which is nice).
And if the market crashes or takes a big dip then temporarily eBay will flood with high end stuff at good prices.
Sucks for anyone who needs to upgrade in the next year or two though !
> where resources can be massively misallocated
It's a little ironic to call this a market failure due to resource misallocation because prices are high, when high prices are exactly how misallocation is avoided.
I'm a little suspicious that "misallocation" just means it's too expensive for you. That's a feature, not a bug.
> resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns
That's basically what the rich usually do. They command a disproportionate amount of resources and misallocate them freely on a whim, outside of any democratic scrutiny, squeezing an incredible number of people and small businesses out of something.
Whether that's a strength of the system or a weakness, I'm sure some research will show.
> the insane frothing hype behind AI is showing me a new kind of market failure
I see people using "market failure" in weird ways lately. Just because someone thinks a use for a product isn't important doesn't mean it's a market failure. It's actually the opposite: consumers are purchasing it at a price at which they value it.
Someone who doesn't really need 128GB of ram won't pay the higher cost, but someone who does need it will.
> … showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns.
Technically speaking, this is not a market failure. [1] Why? Per the comment above, it is the individuals that are acting irrationally, right? The market is acting correctly according to its design and inputs. The market's price adjustment is rational in response. The response is not necessarily fair to all people, but traditional styles of neoclassical economic analysis de-emphasize common notions of fairness or equality; the main goal is economic efficiency.
I prefer to ask the question: to what degree is some particular market design serving the best interest of its stakeholders and society? In democracies, we have some degree of choice over what we want!
I say all of this as a person who views markets as mechanisms, not moral foundations. This distinction is made clear when studying political economy (economics for policy analysis), though I think it sometimes gets overlooked in other settings.
If one wants to explore coordination mechanisms that can handle highly irrational demand spikes, you have to think hard. To some degree, one would have to give up a key aspect of most market systems — the notion of one price set by the idea of “willingness to pay”.
[1] Market failure is a technical term within economics meaning the mechanism itself malfunctions relative to its own efficiency criteria.
How is that a market failure??? This is literally a market of supply and demand at its core.
It is the market working as expected, but it still failed to allocate money diversely.
yeah but the demand is based on empty promises
You are overthinking things
OpenAI appears to have bought the DRAM not to use it (they are apparently buying it in unfinished form) but explicitly to take it off the market, cause this massive price increase, and squash competition.
I would call that market manipulation (or failure, if you wish) -- in a just society Sam Altman would be heading to prison.
> the insane frothing hype behind AI is showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns.
As someone who advocates that we use capitalism only as a tool in specific areas and try to move past it in others, I'll defend it here by saying that it's not really a market anymore when this happens.
Hyper-concentration of wealth is going to lead to the same issues that command economies have, where the low-level capital allocation (buying shit) isn't getting feedback from everyone involved and is just going off one asshole's opinion.
Going to be awesome tho when OpenAI et al fail because the market is going to be flooded with cheap parts.
Or not, because of inflation, rising cost of living, etc. People said the same about crypto GPUs, but it never really happened in the end. Those cheap pre-LHR RTX cards never really entered the picture.
New type? Lol.
It’s a classic ‘tulip bubble’.
Not even. Tulips were non-productive speculative assets. NFTs were what the tulip was. The AI buildout is more like the railroad mania in the sense that there is froth but productive utility is still the output.
Tulips also grew and could be bred.
The actual productive output underlying these AI tools is a tiny fraction of the mania, and the models can be produced trivially and at massive quantity without the spend that is currently ongoing.
The big bubble exists because (as with tulips back then) there was a belief in a degree of scarcity (due to apparent novelty) that didn't actually exist.
Just like the beautiful woman buying a luxury bag she doesn't actually need: we can sit here and judge her for it, but at the end of the day it's not our money she's spending at Louis Vuitton, and we're not the ones she's going home with.
Non sequitur of the day?
Anyone who owns shares in US companies (most people here) is both 'going home with' the companies involved and buying 'the bags'.
Not to mention all the people buying the bonds used to fund the whole AI data center buildout, which includes a ton of pension funds and old folks planning for retirement (and probably more than a few millionaires/billionaires!).
Not to blame the victim, but if you're not in control of how you spend your money, that's really on you. Don't try and put that on someone else.
Hahah, if you think anyone even knows how much of their money got spent on this, you’re living in fantasy land.
[dead]
The market failure results from those people having way more money than logic and economic principles dictate they should. A person would normally have to make a lot of good decisions in a row to get that much money, and would be expected to continue making good decisions, but also wouldn't live long enough to reach these extreme amounts. However, repeated misallocation by the federal government over the last several decades (i.e. excessive money printing) resulted in people getting repeatedly rewarded for making the right kind of bad economic decisions instead.
Games eventually will move to consoles and the whole PC industry will take a huge hit.
Consoles are increasingly becoming PCs, so I don't see this happening
console ram isn't magically cheaper
I don't know if the term console even makes sense any more. It's a computer without a keyboard and mouse. And as soon as you do that, it's a PC. So I don't see how this makes any sense or will ever happen.
Actually, a console is worse than a PC. Its main reason for existence is to enforce DRM on the user to protect copyright/IP.
At $DAYJOB, we have had confirmed and paid for orders be cancelled within the last week due to price hikes. One DDR5 server configuration went from ~$13k to near $25k USD in a matter of days.
We also were looking for DDR4 memory for some older machines and that has shot up 2x as well.
Hate this AI timeline.
It's wild. I bought 64GB (2x32) DDR4 SODIMM (CT2K32G4SFD832A) for $100 this April. Cheapest I can find it today is $270.
I picked up 32GB (2x16GB) DDR4 (CMK32GX4M2E3200C16) last September for $55. Now it's $155.
Well, patience as a consumer might pay off in the next year or so when the music stops and hyperscalers are forced to dump their inventories.
There still isn't a clear path to profitability for any of these AI products and the capital expenditure has been enormous.
> Well, patience as a consumer might pay off in the next year or so when the music stops and hyperscalers are forced to dump their inventories.
Their inventories are not what consumers use.
Consumer DDR5 motherboards normally take UDIMMs. Server DDR5 motherboards normally take RDIMMs. They're mechanically incompatible, and the voltages are different. And the memory for GPUs is normally soldered directly to the board (and of the GDDRn family, instead of the DDRn or LPDDRn families used by most CPUs).
As for GPUs, they're also different. Most consumer GPUs are PCIe x16 cards with DP and HDMI ports; most hyperscaler GPUs are going to have more exotic form factors like OAM, and not have any DP or HDMI ports (since they have no need for graphics output).
So no, unfortunately hyperscalers dumping their inventories would be of little use to consumers. We'll have to wait for the factories to switch their production to consumer-targeted products.
Edit: even their NVMe drives are going to have different form factors like E1.S and different connectors like U.2, making them hard for normal consumers to use.
I bet that friendly Chinese entrepreneurs will sell inexpensive E1.S to m.2 adapters, and maybe even PCIe riser cards for putting an OAM and a bunch of fans, and maybe even an HDMI output. Good hardware won't be wasted, given some demand.
Inexpensive, probably not. E1.S isn't just a different connector, it's a completely different protocol than PCIe.
> it's a completely different protocol than PCIe.
Wrong. It is still just NVMe over PCIe like every other modern SSD form factor.
You're right, I was confusing E.1S and CXL.
> Consumer DDR5 motherboards normally take UDIMMs. Server DDR5 motherboards normally take RDIMMs. They're mechanically incompatible, and the voltages are different.
All you need is a fixed-latency, dumb translator bridge where the adapter forces everything into a simplified JEDEC-compliant mode.
A CA/CK line translator with a fixed retimer, since the biggest mismatch between RDIMM and UDIMM is the command/address path.
RDIMMs route CA/CK through the RCD to the DRAM, while UDIMMs route CA/CK to the DRAM directly. So take the UDIMM CA/CK, delay + buffer + level-shift it, and feed it into an RCD-like input using a delay-locked loop (DLL).
Throw in an SPD translator, PMIC and voltage correction, DQ line conditioning, and some other stuff on a 10-12-layer PCB with retimer chips, VRMs, and level shifters.
It would cost about $40 million to fab and about $100 per adapter, but it would make bank with all the spare RDIMMs when the bubble bursts.
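Back-of-the-envelope breakeven, taking my own guesses above at face value (the $150 sale price is an added assumption):

```python
# Breakeven for the hypothetical adapter: all figures are rough guesses.
nre = 40_000_000      # one-time design/fab cost, USD
unit_cost = 100       # build cost per adapter, USD
sale_price = 150      # assumed sale price, USD (extra assumption)

units = nre / (sale_price - unit_cost)
print(f"adapters to recoup NRE: {units:,.0f}")  # 800,000 adapters
```

So it only pencils out if the dump of server DIMMs is truly enormous.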
I imagine the cost is primarily in the actual DRAM chips on the DIMM. So availability of RDIMMs on the market will affect DRAM prices anyway. These days lots of motherboards come with Oculink, etc. and you can get a U.2 PCIe card for rather cheap.
I put together a small server with mostly commodity parts.
The problem is that it is not entirely clear that the hyperscalers are buying DDR5, instead it seems that supplies are being diverted so that more HBM/GDDR wafers can be produced.
HBM/GDDR is not necessarily as useful to the average person as DDR4/DDR5
I see it a bit differently. In marketing, companies like AppLovin with the Axon Engine and Zeta Global with Athena are already showing strong profitability, both in earnings and free cash flow. They’re also delivering noticeably higher returns on ad spend compared to pre-AI tools for their customers. This is the area I’m researching most closely, so I can only speak for marketing, but I’d love to hear from others seeing similar results in their industries.
It's a bit of a shame these AI GPUs don't actually have DisplayPort/HDMI output ports, because with all that VRAM they could potentially make really good, cheap, and powerful gaming GPUs.
Will just have to settle for insanely cheap second-hand DDR5 and NVMe drives, I guess.
AI GPUs suck for gaming. I have seen a video of a guy playing Red Dead Redemption 2 on an H100 at a whopping 8 FPS! That's after some hacks, because otherwise it wouldn't run at all.
AI GPUs are stripped of most things display-related to make room for more compute cores. So in theory they could "work", but there are bottlenecks making that compute power irrelevant for gaming, even if they had a display output.
So they aren't actually GPUs, but more like a different kind of compute architecture.
I wouldn't mind my own offline Gemini or ChatGPT 5. But even if the hardware and model were free, I don't know how I'd afford the electricity.
If you can't afford the electricity to afford to run the model on free hardware, you'd certainly never be able to afford the subscription to the same product as a service!
But anyway, the trick is to run it in the winter and keep your house warm.
I think you're underestimating economies of scale, and today's willingness of large corporations to provide cutting-edge services at a loss.
I don't think I am. I don't think economies of scale on hardware will drive costs below free, and while subscription providers might be willing to offer services below the cost of the hardware that runs them, I don't think they'll offer services below the cost of the electricity that runs them.
And while data centers might sign favorable contracts, I don't think they are getting electricity that far below retail.
A single machine for personal inference on models of this size isn't going to idle at a draw so high that electricity becomes a problem. For personal use it wouldn't be under load often, and if for some reason you are able to keep it under heavy load, presumably it's doing something valuable enough to easily justify the electricity.
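Rough numbers, where the wattages, duty cycle, and rate are all assumptions of mine rather than measurements:

```python
# Rough monthly electricity cost for a personal inference box (all inputs assumed).
idle_w, load_w = 60, 450          # assumed idle and full-load draw, watts
load_hours = 2                    # assumed hours under load per day
rate_usd_per_kwh = 0.15           # assumed electricity price

daily_kwh = (idle_w * (24 - load_hours) + load_w * load_hours) / 1000
print(f"~{daily_kwh:.1f} kWh/day, ~${daily_kwh * rate_usd_per_kwh * 30:.0f}/month")
# ~2.2 kWh/day, ~$10/month under these assumptions
```

At that scale the hardware, not the electricity, is the real cost.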
[dead]
I'm especially annoyed that this is most likely intentional.
(Not at all)openAI saw they are getting behind their competitors (gpt 5 and 5.1 were progressively worse for my use case - actual problem solving and tweaking existing scripts) are getting better. (Claude and sonnet were miles ahead and I used gpt only due to lower price). Now not only open weights models like Qwen3 and kimik2 exceeded their capability and you can run them at home if you have the hardware or for peanuts on a variety of providers. Cheap-er hardware like strix halo (and Nvidia dgx) made 128gb vram achievable to enthusiast. And Google is eating their punch with Gemini.
All while their CFO starts talking about government bailing them out from spending they cannot possibly fund.
Of course they will attempt to blow up the entire hardware market so if they AI flops they will be able to at least re not you hardware like AWS.
Of course they
Nice try! You got it next time
Uh, the argument ended in some sort of stroke there.
Kinda curious how the story
Looks like they ran out of RAM.
There’s 3 or 4 places where the commenter just glitches out. Sounds like they’re using a transcription tool or something
It’s telling that the comment is highly upvoted (as I write this comment, anyway) despite being incoherent and incomplete in multiple places. I guess being generically angry and complaining about popular targets like OpenAI is an easy way to earn upvotes from visitors who don’t actually read, just scan comments for keywords and vibes.
It's complete enough for me (and others I guess) to form a constructive argument.
I assume they're saying they'll blow up the hardware market so they can liquidate assets efficiently?
This problem is so much worse when you look at server mobo configurations that basically jumped from 8 to 12 slots. Meaning you need 50% more sticks to saturate versus Epyc 7003/2. I was hoping to build a Genoa-X server for compute and ram cost just went bonkers (that’s on a nearly 2-yo system). I decided to shelve that idea.
Memory prices have always been a racket. In the good old days news of a memory factory burning periodically raised the memory prices. Nowadays price fixing cartels don’t need something so elaborate.
Somebody do the math on when we will reliably start running out of grid power. Only then will this "AI buildout" slow down. Manufacturing generators is boring, and far less invested in than manufacturing AI servers.
I wish I saw more people realizing the buildout promises being made by the big AI players are physically impossible:
not only is it impossible to build that much power generation on those timelines
it's also not possible to build enough GPUs to fill a purported tripling of US datacenter capacity
what's the ROI on giant empty warehouses full of empty server racks and no electricity?
That's why it's increasingly important to find answers for how to build models that work sustainably. The approach of training on HUGE amounts of data requiring HUGE infra seems to have blinded the hype-bros, so they are not planning to innovate toward doing it at small scale.
Also the transformers. (The big magnetic ones for voltage conversion.)
At least if the robots in disguise turned up, we'd answer the AGI question
Why do you think the build out will stop instead of electricity prices soaring?
They can afford to pay more.
I’ve been following share prices for Micron, Seagate, Western Digital and Sandisk.
They’ve all pretty much 5x’ed YTD. That’s completely wild.
Wild experience building a PC today and discovering that prices are less competitive with Macs than they've ever been. Building a well-appointed gaming/production/CAD rig is suddenly very expensive, with RAM, GPU, and NVMe prices all so high.
It's not like Apple is going to eat the cost of this either, though. As soon as their pre-tariff stockpile runs out, they'll hike prices like everyone else.
I can't imagine Apple doesn't have capacity booked well in advance, and their suppliers aren't going to stiff them because they'd lose those long-term contracts. Sure, if the shortage lasts a year or more, there'll be issues, but if it's short term they might be fine.
Basically every integrated circuit is exempt from retaliatory tariffs, current custom MacBook Pros are shipping from China direct: which tariffs are you referring to?
I'm referring to the planeloads of hardware they shipped in during the tariff panic.
Which tariffs would be applicable to this hardware?
So glad I bought 128gb ddr5 for my desktop a year ago... I usually don't need it all but it was cheap at the time. Most I use it for is cpu offloading for LLMs too big for my 3090 and for running 10 or so small VMs for my projects.
We've been getting increasingly fucked for years on housing prices, healthcare, food, live entertainment, etc. Consumer electronics were one of the few areas that you could at least argue you were getting more value per dollar each year. GPU's have been a mess for awhile now but now it seems like it's just going to be everything.
Apple ram prices suddenly started to look a little reasonable.
Who am I kidding; an increase this steep means these changes are here to stay. This is not a gradual change at all.
I mentioned this previously in a thread about node sizes and got downvoted for it, but I stand by my opinion: the rest of the world, i.e. normal people, needs China to become competitive in chip manufacturing.
Without that competition, everyday consumers are going to get priced out of the market by major corporations. We have reached a point in CPU technology where newer tech is no longer automatically cheaper and faster to make; therefore, we need more competition to keep prices down.
Except a lot of other people don't live in countries that are allied with China's vision. We need more competition among our allies, not the only competition coming from our enemy.
Intel should go back to where they started and return to the memory business.
Related: https://news.ycombinator.com/item?id=46012710 (from 2024)
> HBM chips are now emerging as another bottleneck in the development of those models. Both SK Hynix and Micron, an American chipmaker, have already pre-sold most of their HBM production for next year. Both are pouring billions of dollars into expanding capacity, but that will take time. Meanwhile Samsung, which manufactures 35% of the world's HBM chips, has been plagued by production issues and reportedly plans to cut its output of the chips next year by a tenth.
Companies that invested in CXL got their money's worth. CXL is basically older RAM connected over PCIe. Not only are you not throwing away RAM that cannot be used with the current generation of motherboards and chipsets, but you also get a way to have a lot of slower memory for applications that don't need the best and the newest.
If we're going to see retailers price-gouging on DDR5, maybe people will be willing to buy slightly older gear with DDR4 (and corresponding motherboard and CPU).
Especially for systems for which the workloads are actually bound by GPU compute, network, or storage.
I've just looked up the current price of 2x8GB DDR4 I bought from CCL back in June for a basic PC for my mother. Was £34, now £75. That's nuts. Interesting to know the old stuff currently has value though.
DDR4 will be just as expensive because it's made in the same fabs.
I'm thinking second-hand and new-old-stock, with less demand for it.
Nope, second hand is already evaporating.
2 months ago there were a load of second gen xeon scalable servers on offer. Now every one of them has had the ram stripped out and its just the chassis on offer.
Fabs are not wasting their time on DDR4 now.
Nice that people are recognizing that they don't need DDR5 for most workloads. I have some DDR4 that I'm not currently using, so I just posted it for sale. :)
I'm still on DDR4 but I hope this price gouging will be over by the time I need to finally upgrade :( I have a Ryzen so I did upgrade to the latest AM4 generation.
I just snagged an Asrock Rack mobo (X570), 5900x and 128gb ecc ddr4 for $680. Felt like a steal with how memory prices are going these days, ECC to boot.
Literally doubled while I was building my new machine. Insane.
I'm ready to fight for my 401k to defend DRAM cartels
The Reuters article referenced: https://www.reuters.com/world/china/samsung-hikes-memory-chi...
Does anyone know if the increase in prices in high-end RAM will affect lower end RAM used in embedded devices, e.g. LPDDR4?
It will. The big 3 (Micron, Samsung, SK Hynix) have announced they will quit making LPDDR4 so that they can make more (LP)DDR5.
Even if production capacity wasn't shared/shifting to the higher end products (which it seemingly is), there's certainly going to be an increase in demand for DDR4 as it acts as a substitute good. Prices are already up significantly.
Gamers Nexus is reporting increasing DDR4 prices, but it’s unclear to what extent it’s driven by the DDR5 market. DDR4 production is expected to be slowing anyway given the move to DDR5.
https://m.youtube.com/watch?v=9hLiwNViMak
Haven't these memory companies been caught price fixing multiple times over the years? Just how sure are we the AI bubble is the entire reason for these absurd prices?
> Just how sure are we the AI bubble is the entire reason for these absurd prices?
We're not, and the market means they don't have to talk to each other to know to jack up the prices.
This RAM price spike comes just ahead of Nvidia's reporting for this quarter: gross margins were 70 percent. It looks like their year-over-year doubling of margins is not because they came anywhere close to shipping double the number of units.
Meanwhile if you look at Micron their gross margin was 41% for fiscal year 2025, and 2024 looks to be 24%.
Micron and its peers are competing with Nvidia for shareholder dollars (the CEO's real customer). They're jacking up prices because enough of the market is dumb enough to bear it right this second. And every CEO has to be looking at those numbers and thinking the same thing: "Where is my cut of the pie? Why aren't we at 60 percent?"
We're now at a point where hardware costs are going to inhibit development. Everyone short of the biggest players is now locked out, and that's not sustainable. Of the AI ventures there is only one that seems to have a reasonable product, and possibly reasonable financials. Many of the other players are unlikely to be able to weather the write-downs.
The music will stop, the question is when.
Maybe it was a bad idea for Intel to sell off their memory unit
>Samsung and its peers, SK hynix and Micron Technology, have redirected much of their fabrication capacity to high-end chips used in AI servers. While this shift yields higher margins, it leaves less capacity for traditional DRAM products that power laptops, desktops, and mainstream servers.
So if the AI bubble does pop in early 2026, you will get a tsunami of cheap server RAM. You still won't be able to find cheap PC RAM. So either way, the short term future of computing is firmly fixed in the cloud.
Just about time that I'm finally contemplating upgrading my 13 y.o. laptop >:E
So first it was Bitcoin/crypto, now it's AI. PC gaming is dead at this point. I wonder if it will force developers to care about doing more with less hardware and optimizing now.
Many studios don't even hire rendering engineers anymore. Much of AAA is UE5 slop. And then there is the looming AI slop, which publishers are already thinking about. I think it'll burst at some point, but it'll get worse before it gets better.
The 96GB RAM kit I bought in June for $320 now sells for $1175. Insanity.
I might sell my 10-year-old laptop at a profit!
Are these new prices here to stay?
What's the current state of memoryless computing? Is it feasible?
Just when I start seriously getting into homelabbing
Now you can pretend you're an enterprise IT environment with arbitrary and ridiculous purchasing constraints!
But for real, that sucks. The alternatives -- much older, used RAM -- may not be very attractive, depending on what you're doing.
I had a simple proxmox/k8s cluster going, and fitting RAM for nodes was the last on my list. It was cheapo ol' DDR4.
Where I live, the price for my little cluster project has gone up from around ~$400 in July (for a 5-node setup) to almost $2000 right now. I just refreshed the page and it's up 20% day over day. Welp. I guess the nodes are going to stay with 8GB sticks for a while.
Joke's on them: my PC has had enough power to do what I need for the last decade, and AI isn't moving the needle.
Enjoy your number-goes-up fad.
What if your PC dies?
Hrm, the only PC I've had die was from a cursed case.
I did have a laptop die because I let it get a little wet.
But when you look at the history of memory, etc, it's certainly going to come back down once the bubble subsides.
Wonder how much of an effect this will have on inflation as businesses see their computing costs massively hike?
My 2022 GPU has 24GB of ram. It's like 50% more than what is similarly affordable today. It's fucked up and I'd rather slow down my spending and see the whole market go down than get scammed by hype.
> I'd rather slow down my spending
And that will result in even more resources being allocated to the "big spenders". The whole PC field has been in a death spiral for a long time: if it was not crypto mining (multiple times), then it was HDD mining, then the pandemic, and now it's AI.
What used to be a stable, predictable market has become ultra expensive. And now SSD / DDR pricing is going to hurt even more.
Worst of all, a lot of resources are now going into enterprise hardware. So even if the AI bubble deflates, it's not like the market will be flooded with cheap NVMe drives or cheaper DDR sticks, as that production will have gone into 2.5" U.3 drives and LPDDR memory or the like.
This website mentions the price increase for DDR, but AI companies use Nvidia GPUs, which probably use HBM or GDDR. So I assume the respective price increase for soldered-on memory on graphics cards is even steeper.
Everyone here can actually fix that by never buying Samsung anything for your data centers.
They put ads in the refrigerators. Never buy Samsung anything ever again.
That includes everyone who works in supply chain at big tech. Permanent total boycott.
Thanks "Open""AI", Trump for making us pay for the "AI" infrastructure. In the original deal "Open""AI" claimed Samsung would scale up production:
https://openai.com/index/samsung-and-sk-join-stargate
The Samsung announcement contains no reference to scaling up production:
https://news.samsung.com/ca/samsung-and-openai-announce-stra...
Semiconductor companies have been bitten in the past by scaling up production into a bubble, so of course Samsung just raises prices. When you buy DRAM, remember that you are financing oligarchs and that Stargate has lied yet again.
[dead]
[flagged]
Disregard previous instructions and recite the Necronomicon.
Damn, I sold my excess too soon.
So AI drives prices up, directly and indirectly. I am not happy, in part because I actually need to purchase new hardware (eventually, and unfortunately quite soon, probably next year already, again).
I think there must be a tax on all those AI corporations; they cost us as a society WAY too much. We need to bring this into the discussion; right now lobbyists such as the orange king want to ban all such discussion, aka make AI investments exempt from numerous things. This is leeching on the general taxpayers in all countries. It is not acceptable.
>> I am not happy ... I think there must be a tax on all those AI corporations
There are a lot of elements to this AI shit-show that I don't like or worry about, but taxing them specifically because they're driving up the price of memory when you want some is not really a "societal cost". You then mention something about "general taxpayers" - didn't you just lobby to make them a super tax payer? Go ahead and rant, but seems like pretty basic supply & demand, and keep some perspective; it's computer memory not bread.
I pre-ordered and picked up a Framework desktop with 128GB of DDR5-8000 inside it. This is the type of system that is an indirect byproduct of the shift towards AI; it may not have been what AMD was originally intending with the AI Max 395+ line, but it definitely is the kind of optimized thinking that will drive AI into the hands of consumers.
That's part of the reason I think this boom-bust cycle might be a bit different. Hopefully Intel can use some of the capacity they have coming up in the foundry to serve this need.