The Cause Of And Fix For Radeon R9 290X And 290 Inconsistency


anthony8989

Distinguished


Hindsight is 20/20. The vast majority of information on microstuttering became widespread after the release and adoption of FCAT in reviews. I unluckily invested in CrossFire before that, when microstuttering was only starting to be discussed in reviews. In addition, all four reviews I read on 7790 CrossFire at the time mentioned nothing about it. I had to learn the hard way. So F***.
 

anthony8989

Distinguished


And yeah buddy, it's my fault the manufacturer advertised CrossFire as a great way to take a mid-level card and nearly double its capabilities with a partner card. Shame on me for believing AMD.
 

fanman

Honorable
Oct 9, 2013
10
0
10,510
Most of the time the R9 290X will be used in Quiet mode, and if needed the buyers will shut down, open their case, switch to Uber mode, and close the case again, because all of the buyers value their ears. After that, if it's no longer needed, they shut down, open the case, and switch back to Quiet mode, and if needed they shut down and open their case again, and so on and so on... the process repeats again and again.
 

slomo4sho

Distinguished


Micro-stutter had been a known issue with multi-GPU setups for years prior to the release of the 7790; it didn't just rear its head between the release of the 7790 and the adoption of FCAT in reviews. Somehow, even this Tom's article from 2011, titled "Micro-Stuttering And GPU Scaling In CrossFire And SLI", concluded that:

CrossFire With Two Cards

If you're only judging based on average frame rates, two cards seem like a great deal for the price. We've had several readers write in, though, complaining about this micro-stuttering issue, which simply cannot be seen in the context of normal benchmarks.

Even at frame rates above 50 FPS, micro-stuttering rears its ugly head, pronounced enough to significantly detract from the gaming experience. A paradigm shift seems necessary, at least until both AMD and Nvidia are able to prevent or mask the artifact. Right now, if you asked us whether it'd be smart to "go cheap" on an inexpensive card and double-down later with another one, we'd have to suggest against it if you're the sort to be bothered by micro-stuttering. The improvement in performance would be negated by the phenomenon's impact. Currently, it seems like cards less powerful than the Radeon HD 6950 are not well-suited for dual-card CrossFire. Even if the frame rates look decent, the slower the GPU, the more pronounced you'll see micro-stuttering during gameplay.

At the same time, not everyone is equally sensitive to time-skewed frame sequences, and quite a few cheap TFT LCD displays help hide the effect. Even so, AMD has a major undertaking ahead of it in order to really improve the dual-card experience.
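To see why the effect is invisible in normal benchmarks, here's a minimal sketch with hypothetical frame times (invented numbers, not measured data): two runs report the same average FPS, but one delivers frames very unevenly.

```python
# Hypothetical frame times in milliseconds -- both runs render
# 6 frames in 120 ms, so both average exactly 50 FPS.
smooth = [20, 20, 20, 20, 20, 20]    # steady 20 ms per frame
stutter = [10, 30, 10, 30, 10, 30]   # alternating fast/slow frames

for name, times in (("smooth", smooth), ("stutter", stutter)):
    avg_fps = 1000 * len(times) / sum(times)
    print(f"{name}: avg {avg_fps:.0f} FPS, worst frame {max(times)} ms")

# The average is identical, but only the second run shows the
# time-skewed frame delivery the article describes.
```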

Poor you indeed. Sadly, there is no cure for stupidity.
 

DrKlahn

Honorable
Nov 5, 2013
24
0
10,510


So your opinion is that because the issue exists, the manner in which it was exposed is inconsequential.

I see it differently. If you are going to even insinuate that a company is stacking the deck and shortchanging the consumer, then you should report on the issue in detail. When the first article ran, I predicted this whole thing was just an issue of the fan profile being too conservative on the retail card. Guess what: it was. I did not guess that a flagship graphics card was being purposefully tested in its conservative mode against its competitors. That still floors me.

I think part of the reason this was presented the way it was is that, if all the data had been presented upfront, it would have elicited a collective shrug from most of the enthusiast community, which was already adjusting clock speed, voltage, and fan targets to get the most performance from the card.



Well, I don't see how the Golden Sample insinuation is not a borderline accusation of fraud on AMD's part. I don't really see how you can interpret it much differently. How did you read it? The article, the follow-up, and the fix came quickly enough that the whole thing probably won't have much net effect. Although there are some people still posting about AMD cheating in this thread, which shows you that they aren't grasping what the issue really was and are just digesting the headline.


As far as benchmarking the card "in the best light" goes: how about flipping the switch to the performance setting rather than the acoustic setting? Tom's reported on the switch's function in their initial 290X article, so they seem to have had this knowledge already.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


Worked up? So stating what your purchasing habits are, and why, is getting "worked up"? I have water cooling, a Koolance unit (an older one, an Exo2 I think) that's noisy compared to what I run today, and a good air cooler matches it. The three fans were never as quiet as I wanted (my dad's either).

I see no non-reference designs for AMD, and I refuse to void the warranty to get one right now. I don't want to mod my card (whatever it is I buy next). I never said non-reference and water weren't better (though it depends on your water kit... LOL).

https://www.youtube.com/watch?v=m1JOhT015ww
Not faster than the 780. ONLY in the ONE resolution they ran, with all tested cards clocked to the max possible any way they could get there ("OC to the wall," they call it). They do what most do (if they overclock at all): come home, find out the max, then call it a day, no warranty harm done (until cards with different coolers come out, this is your option). Benchmarks at 8:35 or so.

Aftermarket coolers won't do much other than allow the card to score the same (maybe a little better) without running the fans at 100% to hit those max clocks (as on all the tested cards). That should bring better thermals and noise, but not much more performance-wise without extravagant cooling (water, LN2, etc.). Tom's shows what a $75 fan mod does too: again, 13-20%, so don't expect more than what Tom's or the LinusTechTips video shows.
 

anthony8989

Distinguished


Didn't think I'd go to the link? LMAO.

" CrossFire and SLI only make sense from the mid-range and higher, with a slight advantage for SLI." - 7790 = mid-range = makes sense.

You're just cementing my point that AMD's drivers are garbage, and have been for years now. But I suppose your own stupidity makes you ignorant of that fact, huh.

"AMD CrossFire™ technology is the ultimate multi-GPU performance gaming platform. Unlocking game-dominating power, AMD CrossFire™ harnesses the power of two or more discrete graphics cards working in parallel to dramatically improve gaming performance.1 With AMD CrossFire™-certified AMD Radeon™ HD graphics cards ready for practically every budget and the flexibility to combine two, three or four GPUs, AMD CrossFire™ is the perfect multi-GPU solution for those who demand the best."

So in addition to their sub-par drivers, they also lie to their customers. You would think with a pitch like that, they'd actually try to back it up. What a fantastic company. Stupid was ever believing they could make anything more than sh*t. Had I gone with GTX 650 Ti Boosts in SLI for $20 more, I wouldn't even hate AMD's crap products right now.
 


No, the HD 7790 is NOT mid-range. That tier belongs to the HD 78xx/R9 270X.

The HD 77xx cards are entry-level GAMING cards.
Even lower are the HD 66xx cards, but most people see those as CASUAL gaming cards, or not gaming cards at all; more as HD-content and multi-monitor cards.

At least that is the general opinion, and I agree with it.
 

anthony8989

Distinguished


I can see where you're coming from. I'm just going by Tom's placement:

http://www.tomshardware.com/reviews/gaming-graphics-card-review,3107-3.html
 


MY opinion, and many others'. Not the website's.
 


I guess AMD doesn't share your opinion then, as that's how they're having the card sold.
 

anthony8989

Distinguished


It was in the context of slomo's argument that, according to a Tom's article, low-end cards made for poor dual-card setups due to exacerbated micro-stutter. I was emphasizing that Tom's refers to the 7790 as a ''mid-range'' card and that it would therefore ''make sense'' to run it in CrossFire.
So in this context the 7790 is a mid-range card. If you have a 7970 or a GTX 680 it may seem low-end or entry-level, but if you have an integrated HD 4000 then it looks pretty mid-to-high. It's all about perspective.
 
So perhaps, instead of arguing over the subjective tier placement of a particular graphics card, why not be more constructive and infer that the greater the performance of the cards, the less evident the micro-stutter will be.

Perhaps the amount of micro-stutter is pretty uniform from high end to low end, and you simply see the same absolute amount of it, but it makes up a greater percentage of the overall frames being drawn the fewer frames you have? Just a quick thought; I've done no investigation, as I'm not really concerned. The issue is mostly moot, with both hardware fixes for new cards and software fixes for older cards in place.
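For what that quick thought is worth, here's a toy sketch of the idea (all numbers invented, no real measurements): if a fixed number of badly paced frames occur each second, they make up a larger share of the output as total FPS drops.

```python
# Invented illustration: a constant count of badly paced frames per
# second becomes a larger fraction of all frames as the FPS falls.
stuttered_per_second = 10

for fps in (120, 60, 30):
    share = 100 * stuttered_per_second / fps
    print(f"{fps} FPS: {share:.0f}% of frames affected")

# 120 FPS: 8%, 60 FPS: 17%, 30 FPS: 33% -- the slower the card, the
# more visible the same absolute amount of stutter would be.
```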

Personally, I think a few of you are just looking for a technical justification for your derogation of somebody else, which doesn't speak well of your own character.

Please, let's add to the topic at hand, not prove who the rudest person is.
 

Serpent of Heaven

Honorable
Oct 10, 2013
13
0
10,510


Non-reference graphics cards are basically reference cards with minor tweaks, better cooling solutions, and higher factory clocks out of the box. At the heart of the matter, a reference card and a non-reference card are almost the same. The only real, unclear difference would be the binning. Take, for example, the Asus 7970 Matrix Platinum Edition versus a regular 7970 GHz Edition card. The truth is that the Platinum Edition Matrix cards from Asus carry better-binned GPUs on the PCB, so they squeeze a few more MHz of core and memory frequency out of the box for a smaller amount of voltage tweaking. The golden egg about vanilla, or reference, cards is that they typically OC higher than non-reference cards. Right now the R9 290X has a max core clock offset of 50%. That's 50% of 1000 MHz, or 500 MHz. So in theory it's plausible that a reference card, with the right cooling solution and enough core voltage, could push clock frequencies "easier", up to 1500 MHz core, versus a non-reference card.
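As a quick check on that arithmetic, here's a worked version using the post's numbers (1000 MHz base, +50% maximum offset); actual attainable clocks depend on cooling, voltage, and the silicon lottery:

```python
# Worked version of the post's overclocking arithmetic.
base_mhz = 1000      # R9 290X reference core clock
max_offset = 0.50    # +50% maximum core clock offset

headroom = base_mhz * max_offset
ceiling = base_mhz * (1 + max_offset)
print(f"headroom: +{headroom:.0f} MHz, theoretical ceiling: {ceiling:.0f} MHz")

# headroom: +500 MHz, theoretical ceiling: 1500 MHz -- a slider limit,
# not a clock any ordinary cooling setup will actually sustain.
```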

Personally, I think AMD did this on purpose: they gave the R9 290X a crap-tastic air-cooling solution because they knew, throwing business to other vendors, that AMD customers would probably go with water-cooling solutions to combat the high 95 deg C temperatures. With water-cooling blocks from Koolance and EK, that 95 deg C will probably drop down to 35 deg C at stock frequencies. Push the R9 290X to its max, and you'll see some 60-ish deg C temps at high core frequencies, for the enthusiast of enthusiasts...
 

Serpent of Heaven

Honorable
Oct 10, 2013
13
0
10,510


Your point about AMD CrossFireX would have seemed more valid a few months ago, before the R9 290X and R9 280 were released. At this point in time, your claims are no longer valid. CrossFireX over the PCIe bus has improved the situation, and it wouldn't surprise me if NVidia's GTX 800 series, aka Maxwell, gets a similar copy-cat of its own for SLI.

If you look at the benchmarks by Hilbert Hagedoorn of Guru3D.com in the NVidia GTX 780 Ti SLI review, they show that the new best single-GPU solution, in SLI, has a tendency to drop frames in some games. Its competitor, the AMD R9 290X, doesn't. It also reinforces the idea that SLI doesn't scale as well as CrossFireX does. While CrossFireX scaling ranges from 1.5 to 2.0 times the FPS of one R9 290X in most PC games, the GTX 780 Ti in SLI scales between 1.1 and 1.5 times a single GTX 780 Ti as the resolution increases. It goes to show that PCIe-based CrossFireX has been a significant improvement for AMD. I'm still waiting for reviews of Tri-Fire and Quad-Fire R9 290X setups. Yes, there will be diminishing returns, but I suspect they won't diminish as much, performance-wise, as with a similar setup of GTX 780 Tis. Chalk it up to CPU bottleneck, whatever... AMD reigns supreme now as the multi-GPU solution, while NVidia retains single-GPU superiority. Well, NVidia retains it with the GTX 780 Ti, but they lost it when the R9 290X basically defeated the GTX Titan and 780 in most PC games late last month.
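For reference, the scaling factors quoted here are just multi-GPU FPS divided by single-GPU FPS; a trivial sketch (the FPS values below are placeholders, not Guru3D's actual numbers):

```python
# Multi-GPU scaling factor: performance relative to a single card
# (2.0 would be perfect two-way scaling).
def scaling_factor(single_fps: float, multi_fps: float) -> float:
    return multi_fps / single_fps

print(scaling_factor(45.0, 85.0))  # ~1.89x, near the top of the 1.5-2.0 range
print(scaling_factor(50.0, 65.0))  # 1.30x, inside the 1.1-1.5 range
```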

You can talk smack about the AMD driver all you want. I'm not even going to lecture you on the difference between NVidia SLI and AMD CrossFireX AFR. The truth of the matter is that the beta drivers are working: the AMD beta driver is keeping the CrossFireX mini-guns in check with its frame-pacing software. The beta drivers are doing the needed job on a software level for AMD customers. Enough said. Bash it all you like; the more you bash it, the more other people will start to realize that AMD has the upper hand for now... Brain-dead NVidia fanboys will continue to cry and bash AMD while NVidia sales slowly start to decline over 2013 Q4 and 2014 Q1-Q2.
 

anthony8989

Distinguished


Serpent, your approach is refreshing. NVidia's and AMD's stock portfolios aside, I was left with a sour taste from AMD's products, and that's all I ever expressed. AMD's drivers had a big problem when I owned their GPUs, and in an attempt to dismiss that fact, people called me stupid. That is what happened. I'm going to stop caring about this now, so to all who disagree with me: your posts fall on deaf ears.
 

Darkresurrection

Honorable
Sep 15, 2013
721
0
11,160


Who on earth buys a 7790 for hardcore gaming!? And then you accuse AMD!? Do you really expect a $100 graphics card to do some magic for you and wash your clothes!? I mean, do you really read before buying anything? The memory bus is 128-bit and it has only 896 stream processors, so anyone with a sound mind can say it is an entry-level graphics card. The same thing can be said about the GTX 650. Doesn't NVidia call the GTX 650 a mid-range graphics card? http://www.pugetsystems.com/parts/Video-Card/EVGA-Geforce-GTX-650-1GB-8891 The 7790 > the 650. NVidia calls the GTX 650 mid-range, so that is OK for NVidia but not for AMD? What a double standard!!!
 