Nvidia GeForce 9800 GT And ATI Radeon 4870 Get A 225 Watt TDP

Status
Not open for further replies.
Guest
The 4870 is not supposed to compete with the whole GTX 200 series... at most it will try to compete with the GTX 260, and the GTX 280 will crush it. The 4870 X2, I think, will beat the GTX 280 — I thought differently at first, but... I don't know. It just seems ATI is taking a step in the right direction while NVIDIA is still concentrating on one massive killer chip, when two smaller mid-range chips would win.
 

terror112

Distinguished
Dec 25, 2006
But that will only happen if the game supports SLI or Crossfire. Since most new games do now, that problem has mostly been ruled out. I just can't wait for the upcoming series... it's been too long.
 

fulle

Distinguished
May 31, 2008
I'm too paranoid about micro stutter to really want SLI or Crossfire. While SLI/Crossfire may deliver higher frame rates, the eye may not perceive them as more fluid... and at the end of the day, that's what matters.
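The micro-stutter worry above comes down to frame *pacing*, not frame *rate*: alternate-frame rendering on dual-GPU setups can push the average FPS up while individual frames still arrive unevenly. A minimal sketch with made-up, purely illustrative frame times:

```python
# Hypothetical frame times (ms) illustrating micro stutter on an
# alternate-frame-rendering (AFR) dual-GPU setup: the numbers here
# are invented for illustration, not measurements of any real card.

single_gpu = [20.0] * 10        # steady 20 ms per frame
dual_gpu = [8.0, 25.3] * 5      # frames arrive in uneven pairs

def avg_fps(frame_times_ms):
    # Average FPS = frames rendered / total elapsed seconds.
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

print(round(avg_fps(single_gpu), 1))  # 50.0 FPS, perfectly smooth
print(round(avg_fps(dual_gpu), 1))    # 60.1 FPS on paper...
print(max(dual_gpu))                  # ...but the worst frame takes 25.3 ms
```

The dual-GPU run "wins" on average FPS, yet its slowest frames are worse than the single GPU's steady cadence — which is exactly why a higher benchmark number may not look smoother.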

More on topic, I really hope these cards don't require that sort of wattage to run at stock speeds. It sounds hot, and unnecessary for a 55nm midrange GPU.
 

LAN_deRf_HA

Distinguished
Nov 24, 2006
ATI claims to have fixed the micro-stuttering issue. With the price, performance, and DX10.1 support (which we've seen does improve performance), it's hard for me to see the point in owning one of Nvidia's new cards. Sadly, I know people will still buy more of Nvidia's cards, given how long ATI would have to stay on top for the news to filter down to the average buyer, who doesn't research products.
 

blppt

Distinguished
Jun 6, 2008
I long for the days when a product like the PowerVR2 could come out at a good price point with high efficiency (thanks to its tiling architecture) and relatively low power consumption. Too bad they were never able to design a hardware T&L engine to work with the PowerVR architecture.

Pretty soon we are all going to need small nuclear reactors in our cases to power these "brute force" wonders from ATI and nVidia. What happened to design elegance?
 

fulle

Distinguished
May 31, 2008
@blppt
Interesting someone brought up PowerVR2... I actually owned a KyroII back in the day... and a Dreamcast. Good concepts there, like not "wasting fillrate and texture bandwidth on pixels that are not visible in the final image." TBDR, and HSR were good ideas that squeezed more performance out of otherwise inferior hardware.

I wish Nvidia and ATI/AMD would try to do things more efficiently, rather than try to figure out how to feed their beasts more watts. I suppose this is still great news to overclockers though.
 

hellwig

Distinguished
May 29, 2008
I'm not up to date on motherboards and PCIe, but how can a motherboard and a PCIe slot each supply 75W to a card plugged directly into the PCIe slot (which resides on the motherboard)? I was pretty sure 75W was the total supplied by the board over PCIe. If the board itself supplies an additional 75W, where does it come from? How does that 75W reach the card if not through the PCIe slot (which is apparently already supplying its own 75W)?
 
GDDR5 vs. GDDR3: it mostly comes down to power consumption, maximum memory clock speeds, and so on, not raw performance. That said, the higher effective clock speeds do give the GPU more memory bandwidth, which can translate into more performance — but bandwidth alone guarantees nothing.
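The bandwidth advantage mentioned above is simple arithmetic: GDDR5 moves four data transfers per clock versus GDDR3's two. A rough sketch, using the commonly cited launch-era clocks for 4850/4870-class cards as assumptions (treat the exact MHz figures as illustrative):

```python
# Peak memory bandwidth = effective transfer rate * bus width in bytes.
# Clock figures below are assumed launch-era values, for illustration only.

def bandwidth_gbs(clock_mhz, transfers_per_clock, bus_width_bits):
    # Result in GB/s (decimal gigabytes, as marketing figures are quoted).
    return clock_mhz * 1e6 * transfers_per_clock * (bus_width_bits / 8) / 1e9

gddr3 = bandwidth_gbs(993, 2, 256)  # GDDR3: double data rate
gddr5 = bandwidth_gbs(900, 4, 256)  # GDDR5: quad data rate

print(round(gddr3, 1))  # 63.6 GB/s
print(round(gddr5, 1))  # 115.2 GB/s
```

Nearly double the bandwidth at a *lower* base clock — which is why GDDR5 on a 256-bit bus was such a big deal for a midrange card.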
 

homerdog

Distinguished
Apr 16, 2007

A PCIe 2.0 slot can supply up to 150W.
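Whatever the slot's exact ceiling, the 225W TDP in the headline lines up with the common budget of 75W from the slot plus 75W from each of two 6-pin PCIe power connectors. A back-of-the-envelope sketch, assuming those usual per-source limits:

```python
# Assumed per-source limits from the PCIe/ATX conventions of the era.
SLOT_W = 75      # PCIe x16 slot (baseline rating)
SIX_PIN_W = 75   # each 6-pin PCIe auxiliary power connector

budget = SLOT_W + 2 * SIX_PIN_W
print(budget)  # 225 — matching the TDP in the article title
```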
 