Core Clock Graphics Nvidia/ATI

kudu

Distinguished
Oct 3, 2009
The new games to come will require more speed and RAM. Why not bring out a card that has a 1500 MHz core clock and 5 GB of RAM? That should last 5 years. New graphics cards should be at least 3-4 years ahead of the new games. I want to get a new card now and still play the latest games to come in 4 to 5 years' time on full graphics with no problem.

Regards
kudu
 
Because 5GB of GDDR5 VRAM would probably run you about $400. Add to that the heat it would produce, and it's not just that it isn't commercially viable, it's that it isn't physically practical: the PCI Express spec only lets a card draw roughly 300 watts (75 W from the slot plus the supplementary power connectors), and that much power would roast the other components inside a system. They also don't know what new games will need in the future; the biggest performance increases come from changes to DirectX, and you can't make a DX13-compliant card if no one has even formed a committee to discuss DX13 yet.
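For a rough sense of where that power ceiling comes from, here is a back-of-the-envelope sketch. The connector allowances match the PCIe 2.0 spec; the particular card configuration is just a hypothetical example:

```python
# Rough PCI Express power budget for a single graphics card (PCIe 2.0 era).
# Each supplementary connector adds a fixed allowance on top of the slot itself.
SLOT_W = 75          # power delivered through the PCIe x16 slot
SIX_PIN_W = 75       # one 6-pin PCIe power connector
EIGHT_PIN_W = 150    # one 8-pin PCIe power connector

# A hypothetical high-end card with one 6-pin and one 8-pin connector:
card_budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(f"Maximum in-spec board power: {card_budget} W")  # 300 W
```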

Why pay $800 today for a card that will still play games well in 4 years, when you can spend $100 every year, end up with a better card, and have only spent $400? Companies provide viable solutions to what is needed now, not to what might be needed years in the future; even their crystal balls don't know the answer to that.
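To make that arithmetic concrete, here is a quick sketch of the two buying strategies. The prices are the hypothetical figures from the post, not real card prices:

```python
# Compare buying one expensive "future-proof" card today versus a cheaper
# card every year over the same four-year window.
FUTURE_PROOF_CARD = 800   # one-time purchase, hypothetical price
YEARLY_CARD = 100         # price of a midrange card each year
YEARS = 4

yearly_total = YEARLY_CARD * YEARS
print(f"Buy once:       ${FUTURE_PROOF_CARD}")
print(f"Upgrade yearly: ${yearly_total}")   # $400, and the last card is newer tech
print(f"Difference:     ${FUTURE_PROOF_CARD - yearly_total}")
```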
 

4745454b

Titan
Moderator
Just put out a card at 1500 MHz? Gee, why don't we just put out a CPU at 9 GHz so this card isn't bottlenecked? I guess the system RAM should be what, 30, no, 45 GB? Let's see, what else should this fantasy system have? Have you figured out yet what's wrong with your idea?
 
Trust me, if they could, they would. However, it is nowhere near physically possible right now; the chips would just break down or fry before then. That said, there are rumors of some of the new cards overclocking to 1 GHz core clocks, and Nvidia's new cards may have the option of huge amounts of RAM (though that is really for the Tesla cards), so we aren't too far off.

That said, I doubt even your hypothetical card would last that long. As we saw when 2 GHz Core 2s clobbered 4 GHz Pentium 4s, clock speed isn't everything: better, more efficient hardware will beat it out in the end.
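The Core 2 versus Pentium 4 point boils down to throughput being roughly instructions per clock times clock speed, not clock speed alone. A tiny illustration with made-up IPC numbers, just to show the shape of the argument:

```python
# Throughput ~ IPC * clock. The numbers below are illustrative only,
# chosen to show how a lower-clocked chip can still win.
def throughput(ipc, clock_ghz):
    """Rough relative performance: instructions per cycle times cycles per second."""
    return ipc * clock_ghz

high_clock_chip = throughput(ipc=1.0, clock_ghz=4.0)  # long pipeline, high clock
high_ipc_chip   = throughput(ipc=2.5, clock_ghz=2.0)  # wider core, lower clock

print(f"High-clock chip: {high_clock_chip:.1f}")  # 4.0
print(f"High-IPC chip:   {high_ipc_chip:.1f}")    # 5.0 -- faster despite half the clock
```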