Report: AMD Radeon HD 7900 Series to Launch January 9

Status
Not open for further replies.

newood

Distinguished
Mar 30, 2009
Can't wait for the green team. I think the new 7970 will unfortunately end up at 580 level. I'm probably going to wait another gen; my current tri-SLI 480 Hydros are still pumping out good FPS.
 

slabbo

Distinguished
Feb 11, 2009
Time to start saving up to replace my good old Radeon 4850. I loved that card; it still runs fine with everything I throw at it, but a 7900-series card is so tempting.
 

stonedatheist

Distinguished
May 13, 2010
[citation][nom]newood[/nom]Can't wait for the green team. I think the New 7970 will end up at the 580 level unfortunately. I am probably going to wait for another gen, my current tri sli 480 hydro's are still pumping good fps.[/citation]

I can't wait for Kepler either, but if you think the 7970 will only match the 580, then you need to lay off the crack.
 

Ninja Pants

Distinguished
Jul 10, 2011
$700 US? That's 6990 territory (in Latvia, anyway). I know they're single-GPU cards, but will they match a 6990 for dollar value?

And +1 for a4mula: we can't even saturate PCIe 2.1, so why limit your market by going PCIe 3.0-only when there's no reason to?
 

silverblue

Distinguished
Jul 22, 2009
[citation][nom]intel4eva[/nom]Currently, AMD only wins sales on price. No sane person would buy AMD product, were it premium priced, and even so, Nvidia’s superior software support probably makes their premium worth paying.With the hideous failure of the Bulldozer, AMD is going to have to spend the next year+ being all about sales volumes.[/citation]

What on earth are you talking about? AMD cards are more energy-efficient and cheaper on the whole, and AMD's drivers aren't anywhere near as bad as the picture you're painting. What's more, Zambezi is in short supply because it's selling so well; it may not be what we hoped for, but it's hardly a "hideous" failure.

And while we're at it, both AMD's and NVIDIA's drivers support multiple generations and architectures. AMD have historically been slow at fixing problems, but that doesn't mean they aren't trying, and in any case there are plenty of happy people here who don't suffer driver issues. NVIDIA certainly have the better support, but that gap does appear to be closing (and didn't NVIDIA release a card-killing driver set a year or so back...?).
 

JonnyDough

Distinguished
Feb 24, 2007
[citation][nom]DSpider[/nom]I think it's a reference from Futurama... From the iPhone parody episode.[/citation]
[citation][nom]malmental[/nom]http://www.tomshardware.com/forum/ [...] nical-data[/citation]

Futurama is so "in the past" yo.
 

iLLz

Distinguished
Nov 14, 2003
Those specs sound awesome. Here's hoping that both AMD and nVidia double the performance of the current gen.
 

flowingbass

Distinguished
Oct 28, 2010
Has anybody noticed that they keep flaunting the new "1D architecture"?

Nvidia already uses a 1D (scalar) architecture; that's why, back in the 5000/6000 series, you couldn't compare Nvidia cores to AMD cores. It was apples and oranges.

Now that AMD has moved to a 1D architecture like Nvidia's, it's going to be apples to apples. So it looks like we're looking at 2048 1D cores, roughly comparable to CUDA cores.

Maybe that's what justifies the $700 price mark.
 

nottheking

Distinguished
Jan 5, 2006
I see a couple of flaws in the article, at least the Tom's version:

- As others have mentioned, I'm PRETTY SURE the card didn't release back in 2010.
- Moreover, if you intend to hit 240-264 GB/s of memory bandwidth on a 384-bit memory interface, you need an effective transfer rate of 5000-5500 MT/s (a 1250-1375 MHz command clock for GDDR5). A 1 GHz memory clock (4000 MT/s effective with GDDR5) means only 192 GB/s of bandwidth.
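For anyone who wants to check that math, bandwidth is just bus width (in bytes) times effective transfer rate. A quick sketch, using only the figures from this post (the helper function is mine, just for illustration):

```python
def gddr5_bandwidth_gbps(bus_width_bits: int, transfer_rate_mts: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer x mega-transfers per second."""
    return (bus_width_bits / 8) * transfer_rate_mts / 1000

# 384-bit bus at 4000 MT/s (1 GHz GDDR5 command clock, quad-pumped)
print(gddr5_bandwidth_gbps(384, 4000))  # 192.0 GB/s

# Hitting 240-264 GB/s on the same bus needs 5000-5500 MT/s
print(gddr5_bandwidth_gbps(384, 5000))  # 240.0
print(gddr5_bandwidth_gbps(384, 5500))  # 264.0
```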

At any rate, I want more details on precisely how "GCN" will differ from the existing "VLIW4" architecture. The mention of GPGPU focus makes me hope that, for once, the cards will support full double-precision throughput, which would mean their DPFP performance would be a full 50% of their single-precision rate, vs. only 25% for VLIW4 or 20% for VLIW5.
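To put those DP ratios in concrete terms, here's a rough sketch; the 2048-core count is from the rumored specs in this thread, and the 925 MHz clock is purely my assumption for illustration, not a confirmed figure:

```python
# Peak single-precision GFLOPS: cores x 2 ops per clock (fused multiply-add) x clock in GHz
def peak_sp_gflops(cores: int, clock_ghz: float) -> float:
    return cores * 2 * clock_ghz

# DP throughput as a fraction of SP, per the rates discussed above
DP_RATIO = {"half-rate DP (hoped for GCN)": 0.5, "VLIW4": 0.25, "VLIW5": 0.2}

sp = peak_sp_gflops(2048, 0.925)  # hypothetical: 2048 cores at an assumed 925 MHz
for arch, ratio in DP_RATIO.items():
    print(f"{arch}: {sp * ratio:.0f} DP GFLOPS")
```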

As for using GDDR5, I'm entirely unsurprised. Using XDR2 *AND* a >256-bit memory interface would've been overkill; it would've yielded bandwidth that'd see little to no use, while taking the costs of both. This is why, for instance, when the Radeon HD 2900 came out, it only used GDDR3 for its 512-bit memory interface, and not GDDR4; but the 3870, which dropped back to 256-bit, went to GDDR4 again.

As a general rule, if you want to upgrade a step, you can upgrade the RAM type, *OR* make the interface wider, but don't do both. Remember that either upgrade costs money to implement, and eventually your bandwidth gets bottlenecked by the GPU. The goal is to achieve the best balance that gives the best performance for its production cost.
[citation][nom]dsa43f2[/nom]Nvidia and amd needs to make the next gen card PCI-E 3.0 ONLY. if you arent ready for it then you can get the current gen. PCI-e 3 if fully utilized can make a huge difference, this can improve graphics, think of the desginers who has to create huge art work, this can make it so much faster.[/citation]
AMD doesn't even have any PCI-e 3.0 motherboards out, so why would they make a video card that wouldn't work with anything they make?

Also, you demonstrate a fundamental misunderstanding of what's there; PCIe 3.0 would not magically allow for "better graphics." The two main benefits are increased power delivery and improved bandwidth. Currently, PCIe 2.0 is potent enough that a 16-lane slot is plenty for even top-end cards; they'd only start to have an issue on a cheaper motherboard that splits you to two 8-lane slots for SLI/CrossFire, and if you're spending money on two expensive cards you should be spending money on a suitable motherboard, too.
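A rough sketch of the slot-bandwidth math behind that point; the per-lane figures are the standard PCIe rates after encoding overhead (8b/10b for 1.x/2.0, 128b/130b for 3.0):

```python
# Usable per-lane throughput in MB/s, after link encoding overhead
PCIE_MBPS_PER_LANE = {"1.x": 250, "2.0": 500, "3.0": 985}

def slot_bandwidth_gbps(gen: str, lanes: int) -> float:
    """Total one-direction bandwidth of a PCIe slot in GB/s."""
    return PCIE_MBPS_PER_LANE[gen] * lanes / 1000

print(slot_bandwidth_gbps("2.0", 16))  # 8.0 GB/s - plenty for a single top-end card
print(slot_bandwidth_gbps("2.0", 8))   # 4.0 GB/s - the x8/x8 split case
print(slot_bandwidth_gbps("3.0", 16))  # 15.76 GB/s
```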
 