[citation][nom]newood[/nom]Can't wait for the green team. I think the New 7970 will end up at the 580 level unfortunately. I am probably going to wait for another gen, my current tri sli 480 hydro's are still pumping good fps.[/citation]
I can't wait for Kepler either, but if you think the 7970 will only match the 580, then you need to lay off the crack.
[citation][nom]intel4eva[/nom]Currently, AMD only wins sales on price. No sane person would buy AMD product, were it premium priced, and even so, Nvidia’s superior software support probably makes their premium worth paying.With the hideous failure of the Bulldozer, AMD is going to have to spend the next year+ being all about sales volumes.[/citation]
What on earth are you talking about? AMD cards are more energy efficient and cheaper on the whole, and AMD drivers aren't anywhere near as bad as the picture you're painting. What's more, Zambezi is in short supply because it's selling so well; it may not be what we hoped for but it's hardly a "hideous" failure.
And while we're at it, both AMD's and NVIDIA's drivers support multiple generations and architectures. AMD have historically been slow at fixing problems, but that doesn't mean they aren't trying, and in any case there are plenty of happy people here who don't suffer driver issues. NVIDIA certainly have the better support, but that gap does appear to be closing (and didn't NVIDIA release a card-killing driver set a year or so back...?).
[citation][nom]DSpider[/nom]I think it's a reference from Futurama... From the iPhone parody episode.[/citation]
[citation][nom]malmental[/nom]http://www.tomshardware.com/forum/ [...] nical-data[/citation]
I see a couple of flaws in the article, at least the Tom's version:
- As others have mentioned, I'm PRETTY SURE the card didn't release back in 2010.
- Moreover, if you intend to hit 240-264 GB/sec of memory bandwidth on a 384-bit memory interface, you need an effective transfer rate of 5000-5500 MT/s (a 1250-1375 MHz memory clock for GDDR5). A 1 GHz memory clock (4000 MT/s effective when using GDDR5) means only 192 GB/sec of bandwidth.
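For anyone who wants to check the math themselves, here's a quick sketch of the bandwidth arithmetic (the function name is just mine, nothing official):

```python
# Rough sketch: peak GDDR5 bandwidth = bus width (in bytes) * effective transfer rate.
# GDDR5 is quad-pumped, so effective MT/s = 4 x the memory clock in MHz.

def gddr5_bandwidth_gb_s(bus_width_bits, mem_clock_mhz):
    """Peak bandwidth in GB/s for a GDDR5 memory interface."""
    effective_mt_s = mem_clock_mhz * 4        # quad data rate
    bytes_per_transfer = bus_width_bits / 8   # bits -> bytes
    return effective_mt_s * bytes_per_transfer / 1000  # MB/s -> GB/s

print(gddr5_bandwidth_gb_s(384, 1000))  # 1 GHz on 384-bit -> 192.0 GB/s
print(gddr5_bandwidth_gb_s(384, 1375))  # 1.375 GHz -> 264.0 GB/s
```

Plug in 1250 MHz and you get the 240 GB/sec low end of that rumored range, so the numbers only work out if the memory clock is well above 1 GHz.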
At any rate, I want more detail on precisely how "GCN" will differ from the existing "VLIW4" architecture. The mention of a GPGPU focus makes me hope that, for once, the cards will support full-rate double precision, which would mean their DPFP performance would be a full 50% of their single-precision performance, vs. only 25% for VLIW4 or 20% for VLIW5.
As for using GDDR5, I'm entirely unsurprised. Using XDR2 *AND* a >256-bit memory interface would've been overkill; it would've yielded bandwidth that'd see little to no use, while taking the costs of both. This is why, for instance, when the Radeon HD 2900 came out, it only used GDDR3 for its 512-bit memory interface, and not GDDR4; but the 3870, which dropped back to 256-bit, went to GDDR4 again.
As a general rule, if you want to upgrade a step, you can upgrade the RAM type *OR* make the interface wider, but don't do both. Remember that either upgrade costs money to implement, and eventually your bandwidth gets bottlenecked by the GPU anyway. The goal is to strike the balance that gives the best performance for the production cost.
[citation][nom]dsa43f2[/nom]Nvidia and amd needs to make the next gen card PCI-E 3.0 ONLY. if you arent ready for it then you can get the current gen. PCI-e 3 if fully utilized can make a huge difference, this can improve graphics, think of the desginers who has to create huge art work, this can make it so much faster.[/citation]
AMD doesn't even have any PCI-e 3.0 motherboards out, so why would they make a video card that wouldn't work with anything they make?
Also, you demonstrate a fundamental lack of understanding of what PCI-e 3.0 actually offers; it would not magically allow for "better graphics." Its two main benefits are boosted power delivery and improved bandwidth. Currently, PCI-e 2.0 is potent enough that a 16-lane slot is plenty even for top-end cards: you'd only start to have an issue with a cheaper motherboard that splits you into two 8-lane slots for SLI/CrossFire, and if you're spending money on two expensive cards you should be spending money on a suitable motherboard, too.
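To put numbers on that, here's the same back-of-the-envelope calculation for PCI-e link bandwidth (my own sketch; the key detail is that Gen 2 loses 20% to 8b/10b line encoding while Gen 3 uses the leaner 128b/130b):

```python
# Hedged sketch: usable PCI-e bandwidth per direction, after line-encoding overhead.
# PCI-e 2.0: 5.0 GT/s per lane, 8b/10b encoding (80% efficient).
# PCI-e 3.0: 8.0 GT/s per lane, 128b/130b encoding (~98.5% efficient).

def pcie_bandwidth_gb_s(lanes, gt_per_s, encoding_efficiency):
    """Peak usable bandwidth (GB/s, one direction) for a PCI-e link."""
    usable_bits_per_s = lanes * gt_per_s * 1e9 * encoding_efficiency
    return usable_bits_per_s / 8 / 1e9  # bits -> bytes -> GB

print(pcie_bandwidth_gb_s(16, 5.0, 8 / 10))     # PCI-e 2.0 x16 -> 8.0 GB/s
print(pcie_bandwidth_gb_s(8, 5.0, 8 / 10))      # PCI-e 2.0 x8  -> 4.0 GB/s
print(pcie_bandwidth_gb_s(16, 8.0, 128 / 130))  # PCI-e 3.0 x16 -> ~15.75 GB/s
```

Even the halved x8 slot still gives 4 GB/s each way, which is why current single cards barely notice the difference.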