News Multi-million dollar Cheyenne supercomputer auction ends with $480,085 bid — buyer walked away with 8,064 Intel Xeon Broadwell CPUs, 313TB DDR4-240...

What use a buyer might have for a malfunctioning and outdated supercomputer array is known only to them, but if you see a flood of Xeon chips hitting the market, you'll have a pretty good idea why.

Well, there are probably several hundred universities, many with government or military contracts, with computers far slower than Cheyenne that could be interested in it, not to mention any number of millionaires and billionaires who would love to donate it to a university or organization (or in parts to several) for tax purposes, with the refurbishment costs likely being far lower than buying a new system.
 
Well, there are probably several hundred universities, many with government or military contracts, with computers far slower than Cheyenne that could be interested in it...
Since it's outdated hardware, it may be sent to Russia via third countries. Some Russian companies need spare hardware to keep their data centers running.
 
They could buy one of these and have 1,000+ times more compute power. Payback time on the loss on this B-52, relatively speaking, would be pretty darn quick.

https://nvidianews.nvidia.com/news/nvidia-blackwell-platform-arrives-to-power-a-new-era-of-computing

The government could also just lease space on AWS. No worries about cost of upgrades, maintenance, or leaky pipes.

It's noteworthy that NVIDIA, and AMD to a lesser extent, are crushing Intel. We're at the start of a massive data center refresh cycle to power generative AI, with on-site compute moving to the cloud. That's NVIDIA. We'll start seeing a lot more of these, and at prices heading down.
 
As I understand it, it may cost as much to move as it did to buy, and the monthly electricity costs are something like $90K. Lots of big, heavy cabinets, special room requirements, and you're getting a bunch of old Broadwell-era Intel Xeons.
 
A white elephant - they'd be lucky to break even. I've been doing the same thing on a much smaller scale for years and always lost money.
 
They could buy one of these and have 1,000+ times more compute power...
Probably for the government. But maybe it's a third-world country trying to keep pace.
 
Yeah, nothing like retiring a machine like that. I have seen warehouses full of machines that cost seven figures when they were built; the only value they have now is as scrap. Too much money to run them for the computing power you get.
 
I can't see anyone buying this to have their own 5.34-petaflop system. A brand new system with that much processing power based on Zen 4 or Zen 4c processors would be a fraction of the size and wouldn't require its own small power plant to operate.
Parting it out makes the most sense. Either individual components or complete nodes have value. I recently bought a couple of Broadwell CPUs on eBay to do a final upgrade to a Haswell system that will hopefully last another year or two before it no longer serves my needs. Hundreds of others could do the same if this system were cannibalized and sold off. The only thing really noteworthy about this surplus sale is that it all belonged to the same system, rather than just being a couple of shipping containers full of Broadwell-based server racks.
The buyer could probably get a sizeable tax write-off to donate the Cheyenne facade to a museum, or sell it to a tech company to install in their lobby.
 
A white elephant - they'd be lucky to break even...
LeaseWeb or OVH could buy these up and lease them out for another decade or so - OVH already has water-cooling experience and a bunch of racks in Quebec. Just load the CPUs, DIMMs, and motherboards on a Penske truck and dump the rest, as noted above.
 
Gotta be for the parts. Good luck trying to interface with proprietary, leaky water connectors. Those would all have to be replaced. Forget about converting to air cooling; the cost of engineering and manufacturing the cooling parts for that conversion would be nuts.
 
This is a dumb take:

It isn't easy to provide an analysis of how much the government stands to lose with the sale of Cheyenne

Taxpayers got an extra two years out of this machine by cobbling parts together (due to COVID), which extended its planned lifespan. That's like saying the government lost money when it left the Opportunity rover on Mars.
 
Still, a savvy eBay seller could flip the processors and RAM across the machines for around $700,000 (£550,000), making a hefty profit.
That seems like some rather optimistic math. Keep in mind that they would have various additional fees to take care of on eBay, including listing fees and shipping. Even assuming they managed to get the sale prices listed in the article, the eBay fees alone would likely cut that value down to around $600,000, before even getting into shipping, and presumably paying someone to package up and ship out these orders, possibly test the hardware, and deal with disputes. Selling over 8,000 CPUs and close to 5,000 sticks of RAM in small quantities might be a rather costly undertaking.

And of course, if you flood the market with thousands of identical pieces of used hardware, the value of that hardware is likely to drop. I can't imagine they would want to hold onto this already-dated hardware for longer than necessary, so they would likely need to sell it for lower than the current market price to get rid of it within a reasonable length of time. There's a reason no one spent more than that for the hardware. If it was as easy as tossing it on eBay and raking in hundreds of thousands of dollars, someone would have likely paid more for it.
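As a rough sketch of that math (the gross resale value and winning bid come from the article and auction; the fee rate and per-listing handling cost below are assumptions, not actual eBay pricing):

    # Back-of-envelope resale math. The $700K gross and $480,085 bid come from
    # the article/auction; the 13% final-value fee and $5 per-listing handling
    # cost are assumptions, not real eBay rates.
    gross_sales = 700_000        # optimistic flip value quoted in the article (USD)
    winning_bid = 480_085        # what the buyer actually paid (USD)
    fee_rate    = 0.13           # assumed marketplace final-value fee
    listings    = 8_064 + 5_000  # CPUs plus roughly 5,000 RAM sticks
    handling    = 5              # assumed packing/shipping labour per listing (USD)

    after_fees = gross_sales * (1 - fee_rate)       # ~$609,000
    net        = after_fees - listings * handling   # ~$543,700
    print(f"margin over the winning bid: ${net - winning_bid:,.0f}")  # ~$63,600

Even before the market-flooding price drop, the headline "hefty profit" shrinks to a margin that barely covers the time involved.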
 
"It isn't easy to provide an analysis of how much the government stands to lose with the sale of Cheyenne."

Zero. It outlasted expectations

Whatever is spent on the system upfront is a sunk cost and at the end of life, it just has scrap value for the bones, which helps offset disposal costs

The accounting lifecycle of the equipment is five years - that's how long the hardware support contracts last and the point where failure rates haven't started ramping up. A 1% failure rate at 8 years with those kinds of thermal loads is impressive

Anything past "vendor supported lifespan" is a bonus (potentially a liability, depending on your accounting structure). Cheyenne is eight years old and it's arguable that the electrical costs of running it (both the racks and the cooling system) are enough to justify replacing it with newer kit

$/FLOP isn't just a purchase item, but also a running cost calculation

The old mare has been belatedly put out to pasture, and the new filly should have been in place two years ago.

That new system will likely be able to perform 5-10x the workload for the same power draw (if not vastly more). Things have changed a _lot_ in the last 8 years, particularly in the kind of processing that's needed for weather/climate modelling (GPUs lend themselves particularly well to this kind of task)
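To put rough numbers on the $/FLOP running-cost point (the ~$90K/month electricity figure and the 5.34-petaflop peak come from earlier in this thread; the sustained-utilisation level is an assumption):

    # Electricity as a component of $/FLOP. Peak performance and power cost are
    # figures quoted in this thread; the utilisation level is an assumed placeholder.
    PEAK_FLOPS     = 5.34e15   # 5.34 petaflops peak
    ELEC_PER_MONTH = 90_000    # ~$90K/month in electricity (USD)
    UTILISATION    = 0.5       # assumed average sustained utilisation

    elec_per_year = ELEC_PER_MONTH * 12                      # ~$1.08M/year
    work_per_year = PEAK_FLOPS * UTILISATION * 365 * 86_400  # FLOPs of work per year
    cost_per_exa  = elec_per_year / (work_per_year / 1e18)   # dollars per exaFLOP of work
    print(f"electricity alone: ~${cost_per_exa:.0f} per exaFLOP of sustained work")  # ~$13

A replacement doing 5-10x the work for the same power draw cuts that to roughly $1-3 per exaFLOP of work, which is exactly the calculation that retires machines like this.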
 
That seems like some rather optimistic math.
To be honest, $480K probably doesn't even cover the administrative costs of putting the thing up for auction, whilst simultaneously leaving almost no margin for onselling.

In all likelihood, it would have been cheaper to just haul it off to the local recycling centre and pay the disposal fees.

As Cryo says, the sheer volume of 9-10-year-old stuff appearing in the channel is going to have a marked effect on second-hand pricing, probably depressing it by 10-20%, and there's a diminishing window of opportunity to actually offload it. My home NAS is based around this generation of stuff (workplace castoffs) and I'm looking to get rid of it due to the excessive power draw compared to newer kit.

I'd be a perfect candidate for dropping the CPUs or extra RAM into my system, but at the going rate for those things it's cheaper to put the money into a newer (old) cast-off system. Nobody in business will buy it because the TCO doesn't make sense, and the CPUs would make extremely expensive keyring tags.

The real value is in the rack hardware, but even then it's marginal as a business case (which is why you can get old server racks essentially for free most of the time)
 
Gotta be for the parts. Good luck trying to interface with proprietary, leaky water connectors.
Most of these kinds of systems have water-cooled racks (rear-door coolers), not plumbed-in servers (the photos I can see all show cooled doors and air-cooled systems), for precisely the reason that plumbing always leaks sooner or later and you really don't want that water inside a computer case.

There are ways of preventing that(*), but if they'd been used in the first place, the leaky connectors wouldn't be an issue: they'd simply suck in a little air rather than dripping/spraying water around the server room, with all the attendant mould/fungus issues that come with that kind of event. (That's more of an issue than the hardware getting damp; the hardware can handle a little moisture, but mould has a tendency to make people _extremely_ ill over long exposure periods, especially in an environment with high levels of air circulation keeping spores airborne.)

(*) In my last server room before retiring, we spent £250K on a 100 kW cooling system. Going "leakproof" added about £25K to the final cost and involved interposing an isolated loop into the server room, backed by a vacuum pump - you can literally crack open a 4-inch feeder valve and the water stays inside the pipe whilst lots of air gets sucked in. Even if the vac pump fails, the isolated loop limits the worst-case amount of water dumped to a few tens or hundreds of litres, versus potentially a few thousand litres. Manglement griped until I asked them how much a week's downtime due to a flood would cost, or workers' comp claims due to mould issues in the event of a slow leak.
 
Well, there are probably several hundred universities, many with government or military contracts, with computers far slower than Cheyenne who could be interested in it
Nope. Not even with somebody else's ten-foot bargepole. The hardware is well past its use-by date, draws too much power (as a system or as individual servers), and has entered the other side of the bathtub failure curve.

Even if it were free, it would be too expensive; the staff costs of keeping it running would see to that.

There's a small market among businesses with geriatric systems they _have_ to keep running for contractual reasons(*) and among hobbyists, but that's about it. Kit that's 7+ years old is essentially worthless.

(*) I had a SPARCstation 5 sitting in one rack for 20+ years because one piece of software was keyed to it, and that software supported an interplanetary probe launched a very long time ago. One day it failed to power up after a scheduled outage - "and there was much rejoicing" as the owners realised they could run the software in an emulator instead ("Oh, but it's too haaaard!").
 