News AMD Radeon RX 7900 XT Board Design Revealed: 24GB and 450W

BILL1957

Commendable
Sep 8, 2020
Nvidia has always been crucified for its high power draw numbers; many claimed they would not buy Nvidia for that reason, and now AMD joins them.
Will there be the same outcry from the AMD fans now that they are in the same situation?
 
Reactions: KyaraM

Eximo

Titan
Ambassador
Um, triple 8-pin isn't exactly new to AMD... the 6900 XT and 6950 XT both have versions with that. Not to mention similar power profiles to the 3080 and 3080 Ti.

Fury X was pushing 275W, the Fury Duo clocked in at 350W, and the R9 295X2 was a whopping 500W. The 8990 was 375W...

Nvidia wasn't any better; they had plenty of 250W-plus cards, and their dual-GPU boards could demand 375W as well.
 

JamesJones44

Prominent
Jan 22, 2021
Nvidia has always been crucified for its high power draw numbers; many claimed they would not buy Nvidia for that reason, and now AMD joins them.
Will there be the same outcry from the AMD fans now that they are in the same situation?
Based on the comments about what's happening with the 7000 series CPUs, I doubt it. The side-stepping and twisting to fit fanboyisms on all sides (AMD, Intel, and Nvidia) is always a popcorn-worthy event, though.
 
Reactions: KyaraM

blacknemesist

Distinguished
Oct 18, 2012
Good to see there will be alternatives to the 12-pin BS. Some of us already have good, expensive, and reliable PSUs; dropping another sack of money on something you will only use for the GPU (considering CPU and mobo prices for AM5, most gamers do not even need them, and they are extremely expensive) is just a plain bad choice, especially because in as little as a couple of months affordable PSUs and other "new" components will drop.

The power draw is inevitably going to go higher and higher; thinking otherwise when the technology has not made a 180° on the architecture is just wishful thinking. In fact, with ATX 3.0 we are going the opposite way: the PSU has to deliver 600W+ excursions to the GPU for spans under 10ms, otherwise the system crashes.
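To put rough numbers on that transient argument: the power-excursion tolerances commonly cited in public summaries of the ATX 3.0 design guide can be sketched as a quick check (treat the exact percentages and durations below as approximations, not authoritative spec values):

```python
# Commonly cited ATX 3.0 power-excursion tolerances (approximate values
# from public summaries of the spec, not the spec text itself).
# Each entry: (max duration in milliseconds, multiple of the PSU's rated
# power that the unit must ride through without shutting down).
ATX3_EXCURSIONS = [
    (0.1, 2.0),    # ~100 microseconds at 200% of rated power
    (1.0, 1.8),    # ~1 ms at 180%
    (10.0, 1.6),   # ~10 ms at 160%
    (100.0, 1.2),  # ~100 ms at 120%
]

def survives_spike(psu_rated_watts, spike_watts, spike_ms):
    """Return True if an ATX 3.0 PSU of the given rating should ride
    through a spike of spike_watts lasting spike_ms milliseconds."""
    for duration_ms, multiple in ATX3_EXCURSIONS:
        if spike_ms <= duration_ms:
            return spike_watts <= psu_rated_watts * multiple
    # Longer than 100 ms: treat it as sustained load.
    return spike_watts <= psu_rated_watts

# A 600 W spike lasting ~8 ms on a 750 W ATX 3.0 unit falls in the
# 160% window (1200 W allowed), so it should not trip the PSU:
print(survives_spike(750, 600, 8))  # True
```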
 
Reactions: KyaraM

BILL1957

Commendable
Sep 8, 2020
Good to see there will be alternatives to the 12-pin BS. Some of us already have good, expensive, and reliable PSUs; dropping another sack of money on something you will only use for the GPU (considering CPU and mobo prices for AM5, most gamers do not even need them, and they are extremely expensive) is just a plain bad choice, especially because in as little as a couple of months affordable PSUs and other "new" components will drop.

The power draw is inevitably going to go higher and higher; thinking otherwise when the technology has not made a 180° on the architecture is just wishful thinking. In fact, with ATX 3.0 we are going the opposite way: the PSU has to deliver 600W+ excursions to the GPU for spans under 10ms, otherwise the system crashes.
When I did my most recent Alder Lake build, I oversized the PSU from what I thought I "needed," made sure it had ample amps on the 12V output, and had three separate 8-pin connectors available for plugging in the 12-pin adapter for the GPU.
I also bought a solid, well-respected name brand and spent a little more for a platinum unit over the gold model, mostly because those models generally use a slightly better grade of components, which contributes to the better efficiency ratings.

I was originally waiting for the 4080 to drop, but with the recent reveal that there will now be a lower-performing 12GB 4080 (which of course will be the cheapest model), plus questions about release dates, initial product availability, and of course unknown pricing, last week I got off the GPU merry-go-round and purchased one of the 3090 Ti cards at its new price point.

Now with yesterday's reveal that EVGA is not continuing in the GPU market, I think initial 40-series product availability may be affected, because at least here in the U.S.A. reports are saying EVGA cards represent about 40% of the Nvidia cards sold in North America.

It will be interesting going forward.
My computer upgrades are done for at least the next two to three years, so it does feel better just being a spectator now.
 
The power draw is inevitably going to go higher and higher; thinking otherwise when the technology has not made a 180° on the architecture is just wishful thinking.
There is no 180 to be had here; there is no magical way to make things go faster without using more power.
If you want a 180, YOU will have to make it; don't wait for the companies to do it for you.
Get a 1080p card instead of a 4K card and you will use far less power.
Upscaling has become much better, so use it. If you want max ultra settings at 4K, a high power draw is the only way there is.
 
Reactions: KyaraM

blacknemesist

Distinguished
Oct 18, 2012
There is no 180 to be had here; there is no magical way to make things go faster without using more power.
If you want a 180, YOU will have to make it; don't wait for the companies to do it for you.
Get a 1080p card instead of a 4K card and you will use far less power.
Upscaling has become much better, so use it. If you want max ultra settings at 4K, a high power draw is the only way there is.
There is a thing called efficiency, which makes something that 20 years ago required 500W now use 100W and be better. TVs are a good example: plasma TVs were power-sucking monsters compared to LED. That won't happen in GPUs or CPUs until there is no way to cool the cards/system; that's when they will have to start using a different approach to make them efficient. The predicted jump in power, and the transient-response demands that only ATX v3 can provide, do not sound even a little bit efficient if the leaked specs are to be taken as a possibility.
 
There is a thing called efficiency, which makes something that 20 years ago required 500W now use 100W and be better. TVs are a good example: plasma TVs were power-sucking monsters compared to LED. That won't happen in GPUs or CPUs until there is no way to cool the cards/system; that's when they will have to start using a different approach to make them efficient. The predicted jump in power, and the transient-response demands that only ATX v3 can provide, do not sound even a little bit efficient if the leaked specs are to be taken as a possibility.
Plasma was different hardware than LED, so sure, if technology ever moves on from silicon to something better, we will get better efficiency.
Otherwise, if you have to compute x number of pixels, there are no tricks you can do; you have to compute them all.
 

jp7189

Distinguished
Feb 21, 2012
Still no info about actual power usage. 3x 8-pin just tells us the power is >300W and <450W (plus PCIe slot power). I'll wait to see power draw in actual gameplay before I start hoisting my torch and pitchfork.
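For anyone wanting to check that range, the nominal PCIe CEM connector ratings (150W per 8-pin, 75W per 6-pin, 75W from the slot, 600W for 12VHPWR; these are the commonly cited spec figures) make the budget arithmetic easy:

```python
# Nominal power ratings for PCIe power delivery (commonly cited
# PCIe CEM figures; real cards may draw transients beyond these).
RATINGS = {"8pin": 150, "6pin": 75, "slot": 75, "12vhpwr": 600}

def board_power_budget(eight_pin=0, six_pin=0, hpwr=0, slot=True):
    """Maximum spec-rated sustained power a card can pull from
    its connectors plus (optionally) the PCIe slot."""
    return (eight_pin * RATINGS["8pin"]
            + six_pin * RATINGS["6pin"]
            + hpwr * RATINGS["12vhpwr"]
            + (RATINGS["slot"] if slot else 0))

# Triple 8-pin plus the slot, as on the rumored RX 7900 XT board:
print(board_power_budget(eight_pin=3))  # 525
```

So three 8-pin connectors cap the cable side at 450W, with the slot's 75W on top; the actual TBP can sit anywhere under that ceiling.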
 
Reactions: TJ Hooker

blacknemesist

Distinguished
Oct 18, 2012
Still no info about actual power usage. 3x 8-pin just tells us the power is >300W and <450W (plus PCIe slot power). I'll wait to see power draw in actual gameplay before I start hoisting my torch and pitchfork.
That's pretty much what we know, but remember that partners might decide to push the cards further and force the damn 12V connector and ATX 3.0 anyway.
This year is a nightmare, because changing anything requires a new system, and to be honest the heat predicted from the new CPUs and GPUs is going to be another money sink unless you already rock a custom water-cooling loop in a case that can handle another 360 rad.
I just don't see how a room without AC will be comfortable at a 30°C room temp in the winter.
 
That's pretty much what we know, but remember that partners might decide to push the cards further and force the damn 12V connector and ATX 3.0 anyway.
This year is a nightmare, because changing anything requires a new system, and to be honest the heat predicted from the new CPUs and GPUs is going to be another money sink unless you already rock a custom water-cooling loop in a case that can handle another 360 rad.
I just don't see how a room without AC will be comfortable at a 30°C room temp in the winter.
It's not like anything changed in the PC world that forces you to use the most cores at the highest clocks... you can just use the CPU you need instead of going after the biggest one.
If somebody needs that much power, they know how to deal with it; everybody else doesn't need to. They would do it out of hobby, out of e-peen, or just because.
 
Sep 18, 2022
It's worth noting, with regard to power, that Igor's Lab's article states: "However, this also limits the maximum board power of the card to 450 watts, whereby the actual TBP should be far below that" (via Google Translate). So anyone claiming that AMD is going to run these ridiculously hot should probably wait until more details come out. They could, but current info is that they're not expected to.

That said, it's also worth noting that Moore's Law is Dead has stated that their sources think the reference cards will be at around 350W, and AIBs might crank the power to the upper limits with crazy overclocks.
 
It's worth noting, with regard to power, that Igor's Lab's article states: "However, this also limits the maximum board power of the card to 450 watts, whereby the actual TBP should be far below that" (via Google Translate). So anyone claiming that AMD is going to run these ridiculously hot should probably wait until more details come out. They could, but current info is that they're not expected to.

That said, it's also worth noting that Moore's Law is Dead has stated that their sources think the reference cards will be at around 350W, and AIBs might crank the power to the upper limits with crazy overclocks.
For the GPUs, the problem isn't the constant power draw but the peaks, since they cause PSUs to shut down if they exceed a certain threshold.
Just search for GPU transient spikes on Google.
 
Reactions: KyaraM

blacknemesist

Distinguished
Oct 18, 2012
It's not like anything changed in the PC world that forces you to use the most cores at the highest clocks... you can just use the CPU you need instead of going after the biggest one.
If somebody needs that much power, they know how to deal with it; everybody else doesn't need to. They would do it out of hobby, out of e-peen, or just because.
Except when something breaks and you have no choice but to buy overpriced 2-year-old+ hardware instead of spending a bit more for a small refresh. No ATX 3.0 would mean you could just replace a small part of your system to accommodate the new GPU; with ATX 3.0 you need a whole new system.
 

martinch

Distinguished
Mar 21, 2014
Will there be same outcry from the AMD fans now they are in the same situation?
As someone who has a preference for AMD for various reasons: yes, 450W of power draw is too much. Personally, I have no interest in a GPU that draws more than ~200W of power (or a CPU that draws more than ~100W), irrespective of who makes it. :) (Then again, I'd like to think I've been evenly critical of all companies when it's been deserved, irrespective of my own personal biases.)
 

TJ Hooker

Champion
Ambassador
Don't you need the extra PCIe 5.0 power from the board so it runs stable? That means a new CPU, memory, and PSU, which is almost the price of a new rig. If AMD really goes 3x 8-pin, they win just by not having the hassle.
To support a graphics card that uses the new 12VHPWR connector, at most you'd need a new PSU. Maybe not even that, if you can get a 12VHPWR adapter (although you'd want to be sure to get one from a reputable company; not sure if those exist yet, though). There's no reason you'd have to replace the CPU and memory at the same time.

As far as handling the larger transient power spikes of newer graphics cards goes, that's an issue regardless of which power connector they use (12VHPWR or multiple 8/6-pin). But if you have a high-quality PSU with decent power margin, you may be OK even if it's not an ATX 3.0 PSU.
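A back-of-envelope version of that margin argument (all numbers below are illustrative assumptions; actual transient multipliers vary card by card):

```python
# Rough PSU headroom check for a non-ATX-3.0 unit. The spike_factor
# is a hypothetical multiplier on the GPU's sustained draw; reviews
# have measured transients anywhere from ~1.5x to over 2x.
def psu_margin_ok(psu_watts, system_watts, gpu_watts, spike_factor=2.0):
    """True if the PSU rating covers the rest of the system plus a
    GPU transient spike of spike_factor times its sustained draw."""
    peak = system_watts + gpu_watts * spike_factor
    return psu_watts >= peak

# 850 W unit, ~200 W rest-of-system, 350 W GPU spiking to ~700 W:
# 200 + 700 = 900 W peak exceeds 850 W, so the margin looks thin.
print(psu_margin_ok(850, 200, 350))  # False
```

A conventional PSU with that little margin may still ride through very short spikes, which is exactly why real-world testing matters more than this kind of estimate.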
 

spongiemaster

Reputable
Dec 12, 2019
To support a graphics card that uses the new 12VHPWR connector, at most you'd need a new PSU. Maybe not even that, if you can get a 12VHPWR adapter (although you'd want to be sure to get one from a reputable company; not sure if those exist yet, though). There's no reason you'd have to replace the CPU and memory at the same time.

As far as handling the larger transient power spikes of newer graphics cards goes, that's an issue regardless of which power connector they use (12VHPWR or multiple 8/6-pin). But if you have a high-quality PSU with decent power margin, you may be OK even if it's not an ATX 3.0 PSU.
Any card that requires a 12VHPWR connector will have a decent adapter in the box.
 
Reactions: TJ Hooker
