Data centers last decades. Many of the current AI hosting vendors such as CoreWeave have crypto origins; their data centers were built out in the 2010s and early 2020s.
Many of the legacy systems still running today are IBM or Solaris servers that are 20 or 30 years old. There's no reason to believe GPUs won't still be in use in some capacity (e.g. inference) a decade from now.
The skeletons of data centers and some components (e.g. cooling) have a long shelf life, but they're also only ~10% of the investment. A plurality of the fiber and rail investment went towards building out linear infrastructure, where improvements could later be milked at the nodes to improve network efficiency (better switches, etc.).
Versus the plurality of AI investment, i.e. the trillions going towards fast-depreciating components, which we can say with relative confidence will likely end up as net-negative stranded assets in amortization terms if current semiconductor manufacturing trends continue.
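To make the amortization point concrete, here's a rough back-of-envelope sketch. Every number in it (the total, the capex split, the useful lives) is an illustrative assumption, not a sourced figure:

    # Back-of-envelope: annual depreciation on a hypothetical $1T AI buildout.
    # All inputs (capex split, useful lives) are illustrative assumptions.
    total_capex = 1_000_000_000_000          # $1T, hypothetical

    shell_capex = 0.10 * total_capex         # ~10% shell + cooling, assume 25-year life
    compute_capex = 0.90 * total_capex       # ~90% GPUs/networking, assume 4-year life

    shell_dep = shell_capex / 25             # straight-line depreciation per year
    compute_dep = compute_capex / 4

    print(f"shell/cooling: ${shell_dep / 1e9:,.0f}B/yr")   # ~$4B/yr
    print(f"compute:       ${compute_dep / 1e9:,.0f}B/yr") # ~$225B/yr
    # The write-down burden sits almost entirely in the short-lived hardware,
    # not in the building.

Under those assumed numbers, the long-lived shell is a rounding error next to the hardware write-downs, which is the whole asymmetry with the rail and fiber buildouts.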
Keeping some mission-critical legacy systems around is different from having trillions on the books that make no financial sense to keep there. Post-bubble, new-gen hardware will likely not carry scarcity pricing and will have better compute efficiency (better capex and opex), so there's no reason to believe companies will keep legacy GPUs around at scale if every rack loses them money relative to new hardware. And depending on actual commercial compute demand, it can simply make more economic sense to retire them than to keep them running.
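The retire-vs-keep decision is ultimately a forward-looking margin question, not an accounting one. A minimal sketch of that logic, where the dollar figures are hypothetical placeholders:

    # Retire-vs-keep: sunk capex is irrelevant; a legacy rack is worth running
    # only if the revenue it earns exceeds its operating cost (power, cooling,
    # space, staff). All figures below are hypothetical placeholders.
    def keep_rack(revenue_per_hr: float, opex_per_hr: float) -> bool:
        return revenue_per_hr > opex_per_hr

    # Scenario: post-bubble, cheap new-gen hardware pushes the market rate
    # for legacy compute below the old rack's own power/cooling bill.
    legacy_revenue_per_hr = 0.80   # $/hr the market will pay for legacy compute
    legacy_opex_per_hr = 1.10      # $/hr to power and cool the old rack

    if not keep_rack(legacy_revenue_per_hr, legacy_opex_per_hr):
        print("retire: every hour of operation loses money, "
              "regardless of what the rack originally cost")

That's the difference from the mainframe case: an old IBM box earns its keep by running software that's expensive to migrate, while a legacy GPU sells undifferentiated compute at whatever the market rate is.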