Report: AMD R9 290X Coming in 8 GB Variants

Status
Not open for further replies.

dovah-chan

Honorable
Feb 5, 2014
1,339
0
11,960
236
A very sneaky move by AMD: adding more VRAM for extra value in order to buy time to sell more cards and fight off Maxwell. I'm not sure Hawaii itself can even utilize an 8 GB frame buffer, but this surely adds a lot more value to AMD's side again, especially with game developers putting the smack down on VRAM and gamers wanting to move on to 4K, which will obviously demand more.

The 980 isn't miles ahead of the 290X in performance, so this is an excellent comeback until the 300 series is completed.
 

realibrad

Distinguished
Mar 29, 2007
295
0
18,790
3
The reason for the increased VRAM has nothing to do with needing a larger frame buffer. For some time now, games have been using VRAM as more than just a frame buffer; you can see this in games like Watch Dogs. Even at 4K you would not need 8 GB for the frame buffer itself. It's that VRAM is now being used as asset storage instead of just working memory. This trend is going to increase, since the consoles use a unified memory architecture and ported games will want to lean on VRAM the same way.
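As a rough sanity check of the point above (my own illustrative numbers, not figures from the thread): the frame buffer itself is tiny next to 8 GB, so almost all of that capacity goes to streamed textures and other assets.

```python
# Back-of-envelope: how much of an 8 GB card the actual frame buffer occupies.
# Assumes 32-bit RGBA color and triple buffering (illustrative values only).
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Return total frame buffer size in MiB."""
    return width * height * bytes_per_pixel * buffers / 2**20

fb_4k = framebuffer_mb(3840, 2160)
print(f"4K triple-buffered frame buffer: {fb_4k:.0f} MiB")   # ~95 MiB
print(f"Share of an 8 GB card: {fb_4k / 8192:.1%}")          # ~1.2%
```

Even at 4K with triple buffering, the frame buffer accounts for only about one percent of an 8 GB card; the rest is available for game data.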


 

dovah-chan

Honorable
Feb 5, 2014
1,339
0
11,960
236
Oh, I get it now. My friend was going on about that the other day and I didn't quite understand what he meant until you laid it out. It seems that with Mantle and the like, almost everything is going to be happening on the GPU now. Although I always thought the reason they never pushed GDDR for general storage is that it isn't as low-latency as DDR, since it's optimized for pushing large amounts of data through a wide bus. DDR is the opposite: hitting the lowest latencies possible, with small accesses from many different programs, is crucial for a snappy user experience.

I guess the latency difference has become negligible? Or maybe since more computational tasks are occurring on the GPU, it wouldn't really matter anyway?
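The bandwidth-versus-latency trade-off mentioned above can be made concrete with peak-bandwidth arithmetic (a sketch using published bus specs; the DDR3 configuration is my assumed comparison point, not one from the thread):

```python
# Peak memory bandwidth = (bus width in bytes) x (transfer rate).
# GDDR trades access latency for bus width and transfer rate, which suits
# GPUs moving large batches of data.
def peak_bandwidth_gbs(bus_width_bits, transfer_rate_gtps):
    """Peak bandwidth in GB/s for a given bus width and GT/s rate."""
    return bus_width_bits / 8 * transfer_rate_gtps

r9_290x = peak_bandwidth_gbs(512, 5.0)    # 512-bit bus, 5 GT/s GDDR5
ddr3_dual = peak_bandwidth_gbs(128, 1.6)  # dual-channel DDR3-1600

print(f"R9 290X GDDR5: {r9_290x:.0f} GB/s")             # 320 GB/s
print(f"Dual-channel DDR3-1600: {ddr3_dual:.1f} GB/s")  # 25.6 GB/s
```

On raw throughput the GPU memory system is more than an order of magnitude ahead, which is why streaming assets from VRAM works so well.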

Also, when I spoke of large frame buffer requirements, I was mainly thinking of Shadow of Mordor, which has a minimum frame buffer requirement of 3 GB, making any card with less than 3 GB unusable at 4K.
 

a1r

Reputable
Aug 6, 2014
41
0
4,540
2
As far as I know, even at 8 GB, individual textures can still only be 4 MB each at most. It's a limit in OpenGL and DirectX and therefore a limitation in hardware, unless something has changed in the past few generations of APIs.
 

giovanni86

Distinguished
May 10, 2007
466
0
18,790
4
More memory, in my book, I thought meant you could play at a higher res. I can play Titanfall maxed at 1920x1080 and maintain 40-60 fps, but once I switch to 2560x1440 the game is unplayable. I figured if I had more VRAM on my 580s I could play it maxed. Just a thought; I may be wrong, that's just how I see it. If I'm wrong, please correct me.
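A quick way to see why the framerate collapses at 1440p even when VRAM isn't exhausted (assuming the game is GPU-bound, so frame time scales roughly with pixel count; the 50 fps figure is illustrative):

```python
# Pixel-count scaling between two resolutions: if the GPU's shading work
# is the bottleneck, framerate falls roughly in proportion to pixels drawn,
# regardless of how much VRAM is free.
def pixel_ratio(w1, h1, w2, h2):
    """How many times more pixels the second resolution has than the first."""
    return (w2 * h2) / (w1 * h1)

ratio = pixel_ratio(1920, 1080, 2560, 1440)
print(f"2560x1440 has {ratio:.2f}x the pixels of 1920x1080")   # 1.78x
print(f"A GPU-bound 50 fps at 1080p drops to roughly {50 / ratio:.0f} fps")
```

Adding VRAM doesn't change that ratio; only a faster GPU (or lower settings) does.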
 

blppt

Distinguished
Jun 6, 2008
451
5
18,785
0
Never made it to market? It was for sale at OCUK for a few months before the stock ran out. You could even have it shipped to the US; I know because I was seriously considering buying one.
 

dovah-chan

Honorable
Feb 5, 2014
1,339
0
11,960
236
Well, there is a point where the GPU can only utilize so much VRAM: once the GPU itself is overwhelmed by the load, any extra VRAM is wasted. It's like putting 4 GB on a 750 Ti. The card isn't fast enough to render games at 4K at an acceptable framerate, so all that VRAM wouldn't help it, except to push up sales among folks who don't know any better.

 

anthony8989

Distinguished
Feb 2, 2013
653
2
19,165
57
Yeah, you're going to need at least two R9 290Xs in CrossFire to get any use out of that 8 GB of VRAM on a 4K monitor.

I'm curious to see the results from reviews, and whether the additional 4 GB per card actually translates into large performance increases.

Nvidia seems to be investing more in architectural advancements and larger caches, which, taking energy efficiency into consideration, seems to be the winning formula. I have my doubts about simply adding more VRAM to the same board. If I were to speculate, I'd say the only noticeable increase in performance would be strictly at 4K, perhaps 7680 x 1440/1600 triple monitor. But even then it seems like the GPUs are still the limiting factor...
 

Gaidax

Honorable
Sep 27, 2013
1,024
1
11,660
123
Unnecessary waste; by the time this is truly needed, the 290X will be stone-age technology.

The only thing I can see this being good for is Tri-Fire plus 4K gaming... and even then, better to just wait another half a year for more suitable hardware and monitors to come out.
 
Nice, but I would like to see an article on 4K gaming benchmarks with these 8 GB cards (Tom's Hardware, could you do it?), since I'm curious whether those GPUs can actually handle 8 GB of memory and not be a gimmick like most of the GTX 770 4GB models were.
 

firefoxx04

Distinguished
Jan 23, 2009
1,371
1
19,660
144
People who think this is silly obviously do not understand how VRAM works in CrossFire or SLI. Lots of VRAM is necessary when using two cards at high resolution.

My CrossFire 6850s run out of VRAM before the GPUs can be pushed to their full potential, but when running just one of them, I run out of GPU before I run out of VRAM.
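One detail worth keeping in mind here (general CrossFire/SLI behavior, not a claim from this post): with alternate-frame rendering, each card holds its own full copy of the assets, so usable VRAM is the per-card amount, not the sum.

```python
# With AFR (alternate-frame rendering), textures and buffers are mirrored
# on every card: adding a second card roughly doubles GPU throughput but
# leaves usable VRAM at the per-card figure.
def effective_vram_gb(per_card_gb, num_cards, mirrored=True):
    """Usable VRAM across the setup; mirrored=True models AFR duplication."""
    return per_card_gb if mirrored else per_card_gb * num_cards

print(effective_vram_gb(8, 2))                  # two 8 GB cards -> 8 GB usable
print(effective_vram_gb(8, 2, mirrored=False))  # hypothetical pooled case
```

That's exactly why a dual-card 4K setup wants big per-card VRAM: the second GPU raises the resolution you can drive without raising the memory ceiling.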
 

blppt

Distinguished
Jun 6, 2008
451
5
18,785
0
VRAM is starting to get used more now: Watch Dogs needed 3 GB+ to run smoothly on Ultra settings at 1080p. Shadow of Mordor may not actually need all 6 GB to run smoothly with Ultra textures (the HD texture pack), but it definitely needs more than 4; my 290X (4 GB) has texture-swapping hitches with Ultra at 1080p. My Titan Black, on the other hand, with 6 GB local, has no issues. I also would not be shocked to learn that GTA5 next year will need a good amount of VRAM for ultra settings.

So yeah, you don't need to be running 4K to worry about VRAM nowadays.
 

SPLWF

Honorable
Nov 5, 2013
8
0
10,510
0
This will solve the problem of running HD textures on Shadow of Mordor. I have a 290, and it hovers at 3800-3900 MB of VRAM on Ultra. A little more would help a bit.
 
Aug 6, 2013
326
0
10,810
13
This card is a nice game-changer. It just about shifts my position from heading for a GTX 980 to this card, even if Nvidia drivers are easier to install and run most of the time.
The AMD 8 GB cards are unavailable in the country I live in at the moment, though.
Great option...
Decision: have one posted across the globe, or wait, or stick with the 4 GB, which I always thought should have been 6 GB minimum...
 