Radeon R9 295X2 In CrossFire: 25 Billion Transistors Game At 4K


siman0

Distinguished
Jan 16, 2011
89
0
18,630
This should have waited for a few driver updates. The only thing this proves is that Tom's has two 295X2s. With AMD's new CrossFire methods it should scale much better than before; the only issue is that the drivers for this card are extremely young. TBH, with beta drivers I'm surprised it ran at all.
 

ShawnT007

Honorable
Nov 16, 2013
16
0
10,510
Chris, I am running two 290Xs at 4K, and I kept getting stuttering in pretty much every game I played. I discovered it was caused by AMD's PowerTune throttling my clock speeds down. I figured out that if you overclock just a little through CCC, the stuttering goes away: I applied a 10% increase to both settings (power limit and GPU clock), set the fan speed to 100%, set the memory clock to 1350 MHz, and presto, my stuttering was gone! Hope this helps.
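In case it helps anyone map those percentages to actual numbers, here's a rough sketch assuming reference 290X clocks of 1000 MHz core / 1250 MHz memory (your cards may report different stock values, so treat these as illustrative only):

```python
# Rough sketch of the settings described above, assuming reference R9 290X
# clocks (1000 MHz core, 1250 MHz memory); adjust to what your cards report.
STOCK_CORE_MHZ = 1000   # assumed reference core clock
STOCK_MEM_MHZ = 1250    # assumed reference memory clock
OC_PERCENT = 10         # +10% applied to both the power limit and GPU clock in CCC

core_target = STOCK_CORE_MHZ * (1 + OC_PERCENT / 100)

print(f"Power limit:  +{OC_PERCENT}%")
print(f"GPU clock:    {core_target:.0f} MHz (up from {STOCK_CORE_MHZ} MHz)")
print(f"Memory clock: 1350 MHz (up from {STOCK_MEM_MHZ} MHz, set directly)")
print("Fan speed:    100%")
```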
 

nyran125tk

Reputable
May 5, 2014
1
0
4,510
Anyone who thinks they need two of these things (even if you are super rich) is insane. But I guess it's fun to benchmark. There's nothing out there that requires this kind of power (and there should never be anything out there that requires it, unless game developers want to see only a few hundred people on the planet playing their games). Also, Crysis in 2007 couldn't run well on 8800s at all; it ran at 15 fps in DX10 with the best cards available on release day. Sometimes games are so far ahead of the hardware that you simply have to wait for the right hardware to come out. If you buy two of these, in three years games will have some new graphics feature that even the 295s don't have. Such a waste of money, getting two of these. But if you're completely loaded and completely bored and just want to show off to your friends... I guess you could justify it for the PR.
 

firefoxx04

Distinguished
Jan 23, 2009
1,371
1
19,660
To the guy wanting to see a 6-12 core Xeon instead of the six-core i7: that's just silly. Unless the game is using a bunch of threads, the Xeon won't help. A high-clocked i7 will be better than a slower Xeon with a bunch of threads.

Would have loved to see Mantle. If the CPU was the bottleneck (four 290X GPUs!), then Mantle would have helped.
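To put a rough number on the clocks-vs-cores point, here's a toy calculation; the clock speeds and the assumption that the game only scales across four threads are illustrative, not measured:

```python
# Toy comparison: a high-clocked i7 vs. a lower-clocked, many-core Xeon
# when a game only scales to a handful of threads.
# Every number here is an illustrative assumption, not a benchmark.

def usable_throughput(clock_ghz, cores, game_threads):
    # A game that only spawns `game_threads` worker threads can't use extra cores.
    return clock_ghz * min(cores, game_threads)

GAME_THREADS = 4  # assumed: a typical 2014-era game scales to roughly 4 threads

i7 = usable_throughput(clock_ghz=4.4, cores=6, game_threads=GAME_THREADS)
xeon = usable_throughput(clock_ghz=2.6, cores=12, game_threads=GAME_THREADS)

print(f"i7   (4.4 GHz, 4 usable threads): {i7:.1f} GHz-threads")
print(f"Xeon (2.6 GHz, 4 usable threads): {xeon:.1f} GHz-threads")
# The extra Xeon cores sit idle, so the faster-clocked i7 comes out ahead.
```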
 

bak0n

Distinguished
Dec 4, 2009
792
0
19,010
So glad they are pushing 4K on ultrabooks! Can't wait to try to game with those integrated graphics at native resolution. /Sarcasm off.
 

Durandul

Honorable
Apr 23, 2013
119
0
10,680
For dual-GPU cards, they do address the full memory on the card. One of the few true advantages over having two separate cards is that the memory is managed between both GPUs. However, like all multi-card solutions, you only get 8GB, because the memory is mirrored across both cards.
@RedGarl, they feed the HDMI output through FCAT to analyze each frame, so the measurement has zero performance impact because the analysis isn't done on the test system itself.
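For anyone curious, the FCAT idea is basically: overlay a known color on each rendered frame, capture the display output on a separate box, and then count how long each color stays on screen in the recording. A very stripped-down sketch of that counting step (the capture rate and color sequence are made up for illustration, not taken from the actual tool):

```python
# Stripped-down illustration of the FCAT idea: each rendered frame carries a
# known overlay color; the display output is captured at a fixed rate, and
# frame times fall out of how many captured samples each color occupies.
# The capture rate and color sequence below are made up for illustration.

CAPTURE_HZ = 60.0
SAMPLE_MS = 1000.0 / CAPTURE_HZ

# Hypothetical per-sample overlay colors read back from a captured video.
captured = ["red", "red", "lime", "blue", "blue", "blue", "white", "white"]

frame_times = []
current, count = captured[0], 0
for color in captured:
    if color == current:
        count += 1
    else:
        frame_times.append(count * SAMPLE_MS)
        current, count = color, 1
frame_times.append(count * SAMPLE_MS)

for i, ms in enumerate(frame_times):
    print(f"frame {i}: on screen for ~{ms:.1f} ms")
```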
 
Ok, so I'm looking at the average and minimum FPS these are putting out at 4K now. Obviously, I want that kind of performance from a single card with a single chip for around $500-700. How far away is that? 2020? That's a very rough guess. What sort of die size is it going to take to get that much power from a single chip on a 250-300 W board? Possible? Well... anything is possible, I suppose. Who's got the cash to get it done?
 

piesquared

Distinguished
Oct 25, 2006
376
0
18,780
You don't get it; these results are perfectly acceptable. Sure, someone paying this kind of money expects top-notch support, and the 295X2 has just that. There might be some cases where a dual card isn't needed with these dragsters, but people paying this kind of money aren't going to play only those games; the 295X2 CrossFire setup is there for the games that do need it. Furthermore, these charts are extremely exaggerated, implying that 2 ms of frame latency is bad, and the little disclaimers parroted under the graphs aren't really worth anything. And then to claim that 5 ms is 'not bad' while 6 ms might be 'noticeable'... WTF? lol, no one on this earth could perceive 6 ms of frame latency, let alone a 1 ms difference.
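For perspective, here's what those spikes actually amount to in absolute terms; rough numbers, assuming a 60 FPS average (a 16.7 ms baseline frame time), which is an assumption for illustration, not a figure from the charts:

```python
# Rough arithmetic: what a frame-time spike amounts to, assuming a 60 FPS
# average (16.7 ms baseline). The baseline is an assumption for illustration.
BASELINE_MS = 1000.0 / 60.0

for spike_ms in (2.0, 5.0, 6.0):
    frame_ms = BASELINE_MS + spike_ms
    print(f"+{spike_ms:.0f} ms spike -> {frame_ms:.1f} ms frame, "
          f"{100 * spike_ms / BASELINE_MS:.0f}% longer than the average frame")
```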

Honestly, this smells like a straight-up NV viral marketing campaign. Just like with every other Radeon release, NV trots out the viral marketing machine to spread FUD to willing websites and then "focuses" on trolling the comment section to spread the virus. It's getting quite old. I guess it worked for the 7990, so they figured they'd try it again? The problem is, the 295X2 IS a premium product and the best card on the market by a mile. AMD is the choice for premium 4K, and more viral marketing ain't gonna do squat this time.

By the way, where's that Titan Z, NV? lol
 

Mac266

Honorable
Mar 12, 2014
965
0
11,160
1. You say it's the best card. How so? Two GTX 780 Tis are better and cheaper as well. I'm not saying it's not a great card, but it isn't an "er mer gerd, I've got to have that because everything else is trash" card.

2. This is Tom's; do you really think they would just put out an article/review without actually testing the hardware? This is a product review, not propaganda.

(Actually, it's not even really a review, just someone saying "let's stick expensive stuff in a box and play Battlefield" :p)
 

waikano

Distinguished
Feb 13, 2008
224
0
18,680
Chris, I really appreciate the frankness you put into your reviews, or in this case the update. Going the extra mile and digging deeper shows good troubleshooting skills. Keep making it real and I'll always come back for more.
 

Adroid

Distinguished


AMD with non-optimal drivers? Imagine that...
 

SheaK

Reputable
May 8, 2014
3
0
4,510
I troubleshot Tomb Raider myself - the stuttering was driving me crazy. I had a guess as to what was causing it and followed it to the end.

I loaded AMD's RAM disk utility and gave it 16GB of system RAM, then moved the 11GB (or so) of the game onto that RAM drive.

The stuttering vanished completely - keep in mind I'm talking about actual gameplay, not the benchmark.

If the problem were the AMD cards, loading the game onto a RAM drive wouldn't have fixed it. Also note, the primary drive was a 4x SSD RAID 5 that regularly delivered 800 MB/s for both reads and writes, and somehow that still was not enough I/O for the game to play smoothly at ultra with a big pixel count.

Running from memory, I was measuring I/O greater than 1.5 GB/s.
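If you want to compare your own drives, something like this rough Python sketch would give you a ballpark sequential read number; the path is a placeholder and it isn't the tool I used, just an illustration of the comparison:

```python
# Quick-and-dirty sequential read throughput check.
# TEST_FILE is a placeholder: point it at a large game file on the drive
# you want to test (SSD array vs. RAM drive) and compare the numbers.
# Note: a second run may be served from the OS file cache, so use a freshly
# booted system (or a different large file) for the SSD measurement.
import time

TEST_FILE = r"R:\TombRaider\bigfile.000.tiger"  # hypothetical path
CHUNK = 64 * 1024 * 1024  # read in 64 MB chunks

total_bytes = 0
start = time.perf_counter()
with open(TEST_FILE, "rb") as f:
    while True:
        data = f.read(CHUNK)
        if not data:
            break
        total_bytes += len(data)
elapsed = time.perf_counter() - start

print(f"Read {total_bytes / 1e9:.2f} GB in {elapsed:.1f} s "
      f"-> {total_bytes / 1e6 / elapsed:.0f} MB/s")
```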

I'm starting to think drive I/O cannot be ignored when talking about the greater pixel count of 4K. Granted, most gamers aren't running 32GB systems today, but I wanted to get my results "out there" because I think it's being overlooked as a possible cause of the stuttering issue. Why? More and more games are trying to skip "loading screens" to keep the game immersive, and we're not always aware of where those break points are in-game.

Granted, if all the cards contend with the same drive I/O, there is some merit to the benchmark, but my tests show that a higher level of performance is available when drive I/O is largely taken out of the picture by loading the whole game into memory. Again, a video card shouldn't speed up much due to drive I/O on a quad-SSD system, but then again, my tests were conclusive as well.

Throwing that out there as a contribution to the community of knowledge.

Best,

Warlord Shea
 

Ninjawithagun

Distinguished
Aug 28, 2007
747
16
19,165
This entire review is worthless. If you can't run the test systems the way they would be run in the real world, over DisplayPort, then everything you did for this article is meaningless. Next time, do it right or not at all!!
 

rebel1280

Distinguished
May 7, 2011
391
0
18,780
Whatever happened to silicon photonics from Intel? I think that would greatly reduce latency between the various motherboard components.
 

cub_fanatic

Honorable
Nov 21, 2012
1,005
1
11,960
I'll just ignore the fact that those two GPUs cost more than what I paid for my used car five years ago, and the fact that I probably won't be gaming at that resolution or needing a CPU and GPU with this kind of performance to PLAY VIDEO GAMES until sometime around the early-to-mid 2020s.

One thing that might be remotely relevant to me is that I do like these reference liquid GPU coolers. The newest higher-end consumer Intel desktop CPUs, since around Sandy Bridge, have become very efficient in terms of heat output compared to the days of the Pentium 4 and D. Now any run-of-the-mill 120 mm tower cooler can easily keep an OC'd i5 or i7 relatively cool, and when paired with any mid-range or better GPU, the CPU is no longer the main source of heat in a case. Even a single mid-to-high-end card like a 7870 or GTX 760 is going to be the largest source of heat under load, especially in modern, demanding games.

Since the cheap closed-loop CPU cooler became a staple, most people seem to prefer building a rig with a liquid-cooled CPU and an air-cooled GPU. Reference GPU cooler or not, I think it should be the other way around: if you only had room for one radiator, it would be better for the GPU to be the one liquid-cooled, whether or not it's being OC'd. It would be nice if all full-sized, dual-slot GPUs from both Nvidia and AMD came with a 120 mm CLC like this 295X2's. And I like the second fan on the card itself, which is apparently there to cool the VRM, VRAM, and other parts of the PCB not covered by the water block.
 

709zzy

Honorable
Jul 13, 2013
171
0
10,690
The guy who wrote this review sounds like he has no idea what he is doing and is just making excuses to downplay the performance of the 295X2 in CrossFire. Enjoy that cheque Nvidia sent you.
 