News TinyBox packs a punch with six of AMD's fastest gaming GPUs repurposed for AI — new box uses Radeon 7900 XTX and retails for $15K, now in production

Status
Not open for further replies.

ace6558966

Distinguished
May 4, 2013
2
0
18,510
And the award for least density per RU goes to... But seriously, why in the world does this solution necessitate 12U (expansion plans???)? Six GPUs, an EPYC Rome CPU, 128GB of RAM, five SSDs, an OCP port, and what else?
 

USAFRet

Titan
Moderator
Very true. Six 7900 XTXs, as well as a CPU capable of running them all, aren't cheap, after all. A man can dream, though.
I meant just the game code.

To take advantage of that hardware, it would need a whole new custom version.

There would be a global market of maybe 25 people.

Salaries of a dozen people on the dev team, divided by 25...
 
  • Like
Reactions: Order 66

Order 66

Grand Moff
Apr 13, 2023
2,165
909
2,570
I meant just the game code.

To take advantage of that hardware, it would need a whole new custom version.

There would be a global market of maybe 25 people.

Salaries of a dozen people on the dev team, divided by 25...
Yeah, that's a great point. Maybe more than 25 people, though. I mean, there are thousands of people who can afford a shiny new flagship GPU for $1,000+ every two years. Who's to say those same people couldn't save up their money for a few years and buy the hardware as well as the game? Is it likely, or even feasible? No, of course not, but it could happen.
 

Order 66

Grand Moff
Apr 13, 2023
2,165
909
2,570
And monkeys could fly outta my butt.
Yeah, I suppose you could argue that anything could happen with that logic. :rofl:
On a more serious note, I don't understand how anything besides extremely high-resolution gaming or rendering could use anywhere close to 144GB of VRAM. I know the vague response to my query will be "AI models." OK, sure, but what exactly makes them so VRAM-hungry?
 

USAFRet

Titan
Moderator
On a more serious note, I don't understand how anything besides extremely high-resolution gaming or rendering could use anywhere close to 144GB of VRAM. I know the vague response to my query will be "AI models." OK, sure, but what exactly makes them so VRAM-hungry?
A LOT of calculations that need to happen really, really fast.
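To put rough numbers on the VRAM question: a large model's weights have to live in GPU memory for fast inference, and the weights alone can be enormous. A minimal back-of-the-envelope sketch in Python, assuming fp16 (2 bytes per parameter) and illustrative parameter counts (the TinyBox's 6 × 24GB = 144GB figure is from the article; the model sizes are just examples):

```python
# Rough VRAM budget: how much memory do model weights alone need?
GPU_VRAM_GB = 24          # one Radeon RX 7900 XTX
NUM_GPUS = 6              # TinyBox configuration
total_vram_gb = GPU_VRAM_GB * NUM_GPUS   # 144 GB across the box

BYTES_PER_PARAM = 2       # fp16/bf16: 2 bytes per weight

def weights_gb(num_params: float) -> float:
    """GB needed just to hold the weights (ignores KV cache, activations)."""
    return num_params * BYTES_PER_PARAM / 1e9

# Illustrative model sizes, not any specific product's specs.
for name, params in [("7B", 7e9), ("13B", 13e9), ("70B", 70e9)]:
    gb = weights_gb(params)
    print(f"{name} params -> {gb:.0f} GB of weights, fits in box: {gb <= total_vram_gb}")
```

A 70-billion-parameter model at fp16 needs about 140GB just for its weights, before counting activations or the KV cache, which is why a 144GB pool of VRAM stops looking excessive for AI work.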
 
  • Like
Reactions: Order 66

Order 66

Grand Moff
Apr 13, 2023
2,165
909
2,570
I'm not an AI person, but...

Capturing the semantics of what was written, retrieving lots of data from the large model, comparing that to the captured data, seeing if it fits, and redoing that several thousand times... in a second or two.
Thanks. That explains a lot.
And by "lots of data"...think of "a large percentage of the internet"
Putting it in that perspective helps a lot.
 