
And yet we're talking about electronics here: they don't have sentimental value, and just because compute capacity is sitting unused there is no guarantee it will actually get used, even at a per-unit cost approaching €0.

I'm sure that farmers during the Great Depression were also consoling themselves with the "intrinsic caloric value" of their corn.



As I said, the intrinsic value of a GPU is not measured in €. In fact, the lower the sale price gets, the better a deal it is, not worse - you get the same intrinsic value for less extrinsic cost.

There are also intrinsic costs, mostly power consumption.


Demand for GPU power is much more elastic than demand for food calories.


Food calories are cheaper to convert into something useful. It's not like GPUs, once bought for peanuts, turn into perpetual motion machines. They need power, cooling, a whole infrastructure built around them.
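As a rough sketch of that point (every figure below is an illustrative assumption, not a measured number), the ongoing power and cooling bill for a single high-end GPU is non-trivial even if the card itself were free:

    # Back-of-envelope sketch: even a "free" GPU carries ongoing costs.
    # All figures are illustrative assumptions, not measured data.
    GPU_POWER_KW = 0.7        # assumed draw of a high-end datacenter GPU, in kW
    COOLING_OVERHEAD = 1.4    # assumed PUE-style multiplier for cooling/infrastructure
    PRICE_PER_KWH_EUR = 0.15  # assumed electricity price, EUR per kWh
    HOURS_PER_YEAR = 24 * 365

    annual_cost = GPU_POWER_KW * COOLING_OVERHEAD * PRICE_PER_KWH_EUR * HOURS_PER_YEAR
    print(f"Assumed annual power + cooling cost per GPU: ~EUR {annual_cost:,.0f}")
    # -> roughly EUR 1,300 per year under these assumptions, before staffing,
    #    networking, or floor space.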

If that latent demand existed, GPUs would have taken the world by storm already in the roughly 30 years they've been around.

Even for GenAI, it's likely that ASICs will take over at some point if we really care about performance.


GPUs have taken the world by storm. There's one in almost every computer, and they make up the bulk of supercomputers!

If you put a 75% discount on these powerful GPUs, there will be a long line of non-AI purchasers.


GPUs used to cost 20% of what they cost now, and Intel and AMD make perfectly serviceable GPUs for most PCs. NVIDIA's top-of-the-line GPUs won't suddenly be plugged into lowly laptops.

Yes, lots of companies will buy them cheap, but these AI beasts also carry OpEx costs. Not every alternative use is worth the money, and there is no guarantee that those alternative uses cover the gap. NVIDIA sells 80% of its GPUs for AI now.

I think people don't realize just how big this bubble is.



