CrossFire Versus SLI Scaling: Does AMD's FX Actually Favor GeForce?

Another failed bench. Ancient drivers, and obviously the Intel i7 is faster. Why bother posting this? No one is going to use old drivers, and everyone already knows that the high-end Radeons need very high resolutions to stretch their legs. The Intel platform has given better FPS, especially at low resolutions, for years, in case you have been living under a rock. The FX is still attractive for the price versus an i5, and the Keplers are energy efficient, but I'd still rather have the GCN Radeons for overclocking.
 
That International CES Innovations 2013 Awards Honoree sure has a rough go of it against a 7-year-old chipset ...
[Image: AMD 790FX chipset block diagram]


:ouch:

Is this the new Tom's Hardware fanboy Rorschach test?

:lol:




 
[citation][nom]lolwhat[/nom]you always try to say go Nvidia, then one month latter you say "best graphic card to buy isss 7970 trolololo"[/citation]Did you even stop to consider whether there's any truth to that statement before posting it? Look at the SBM: Tom's Hardware editors use Radeons more often than not, this specific article stated that the Radeon 7970 was the better card, and the only case where GeForce got recommended was when using an AMD CPU.
 
[citation][nom]ericjohn004[/nom]I don't see how anyone can not understand this article. It's very simple. And the last sentence in the article says it all. And Tom's is correct yet again.You can't read this article and tell me that Tom's wasn't completely scientific and did these tests with any biased. If they did, they wouldn't be Tom's Hardware.I was just reading a comment and this guy says that Tom's is are only using games that only use 1 core, and that they should change this so results wouldn't favor Intell. Come on now, All of these games including Skyrim use at least 4 cores. So just so they don't use 8 cores(some of them I'm sure do)Tom's shouldn't use them because AMD wouldn't run them well enough? And not to mention none of those games only use 1 core. Should Tom's cherry pick games just so AMD can MAYBE run as fast as Intel? Or should they pick the appropriate games? It makes absolutely no sense to go cherry picking games that are 8 core. They're are probably only 5 of them out anyways. They wouldn't even make up the whole suite of benchmarks. And they probably aren't demanding enough or aren't well optimized anyways. So the results would be the same anyways.[/citation]

Should readers dumb themselves down to follow your logic and grammar? I'm not saying the guy wasn't dumb, but calling him out while being dumber yourself is a tough sell.
 
AMD's problem with its CPU line is that bigger is not always better, especially when your competition is designing 3D architectures while you stick with two-dimensional designs. To execute the same instructions in a 2D framework as in a 3D framework, you need to make your processor exponentially more powerful than the 3D one. A slight caution: Intel's design is not true 3D, only the first step in that direction. Who will be able to produce a true cube or sphere processor to fully realize the limits of current technological design? Right now I'd put money on Intel, but non-linear breakthroughs are not only possible, they are called for in this new dimension of processor design.

The future will likely see a processor that integrates all graphics and central processing on one 3D chip, along with RAM and SSD, designed around a central core.

Getting back to the subject of this article, it is no surprise that AMD's graphics cards require greater CPU power, as Nvidia has always incorporated more compute power into its designs than AMD. AMD's processors are the over-muscled weightlifters who get out-wrestled by the flabby guys who know leverage and technique.
 
[citation][nom]sarinaide[/nom]Its like I am reading my own posts As has been stated a few times now, this is not about ultimate processor performance, this article is about whether the FX 8350 being the flagship and likely resonate through the family fundametally runs SLI better than Crossfire. The long and short is yes Nvidia cards in single or multi card setups are faster on a FX part than AMD single or multi card equivilants. This is not a terrible thing as it has been shown that the AMD cards require more muscle to be pushed to their potential while NVidia is better at extracting CPU performance. This is not another AMD FX is bad thread so don't confuse it as one, the i7 is just a control test and by using the i7 you rule out whether you are using a top end control or not, this isn't about which part the fX competes against this is purely a scaling test to prove whether SLI out performs CFX on a FX platform.[/citation]

I know this is about the graphics cards. But it still does not change what I said and meant: only testing one of the CPUs against an i7, with no other AMD CPUs being tested, tells us nothing about AMD compatibility being better with one card or the other. All it shows is the result of this one test with this one CPU and only one card from each side, where many other factors could be different with, let's say, the FX-6300 as one example.

A scientific approach is one in which all variables are tested. Using one card from AMD and one from Nvidia, plus one CPU from each manufacturer, and comparing how each behaves with the same card is not scientific or anywhere close to complete.

To be complete you need more variables, such as:

three video cards from each side and at least three CPUs from each side. That way you can tell whether it's just that one CPU preferring that one card over another. This test has very few variables, being that it's only one card from each side and only one CPU from each side, and the CPU being used is not one that is purchased often for gaming, as most buyers go with an i5, not an i7. So its gaming capabilities and these results are kind of moot.
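For illustration, here is a rough sketch (in Python) of the kind of test matrix I'm describing; the part names and the output are hypothetical placeholders, not anything from the article:

[code]
# Rough sketch of a fuller test matrix (illustrative part names only,
# not the article's actual hardware or results).
from itertools import product

cpus = ["FX-8350", "FX-6300", "FX-4300", "i7-3770K", "i5-3570K", "i3-3220"]
gpu_setups = ["HD 7970 CrossFire", "HD 7870 CrossFire", "HD 7850 CrossFire",
              "GTX 680 SLI", "GTX 660 Ti SLI", "GTX 650 Ti SLI"]

# Every CPU paired with every dual-GPU setup: 6 x 6 = 36 configurations,
# versus the two CPUs and two dual-GPU setups the article actually tested.
test_matrix = list(product(cpus, gpu_setups))
print(f"configurations to benchmark: {len(test_matrix)}")
for cpu, gpus in test_matrix:
    print(f"  {cpu:9} + {gpus}")
[/code]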

That is why I say the results and this article are both flawed. It has nothing to do with fanboyism or anything like that. It's pointing out the flaws in the testing and the variables being tested.

An example of how flawed the test is: it's like a car manufacturer testing a Corvette against a Lamborghini using two different tire manufacturers' tires and trying to say that the one car favors one tire manufacturer over the other. So no, this is not accurate at all, no matter how you look at it.
 
AMD will always be my choice. I despise Intel and its business practices. Pound for pound, AMD is better. Why, you ask? Look at the money: when the playing field was even, AMD was the leader. AMD forever!!!
 
[citation][nom]chesteracorgi[/nom] The future likely will see a processor that integrates all graphic and central processing on one 3D chip along with RAM and SSD designed about a central core. [/citation]
Essentially the basis of a microcontroller.
 
I loved the review, great insight.

Now if only our monitors could take advantage of these great frame rates. I want a monitor that will do 85 Hz or greater vertical refresh, and I don't mean the flashing trick manufacturers talk about, strobing the backlight at 120 to 240 Hz to give the effect of a faster refresh. I want to be able to actually SEE these high frame rates!

If you find a monitor that can handle that at, say, 1920x1080 or higher, with sub-20 ms input lag, for under $1,000, and that isn't a classic CRT, please LET US KNOW!
 
Or save money: keep your old CPUs and beat both graphics scores, since that's where it counts.
My i7-930 @ 3.8 GHz just ran the same tests with the same version of 3DMark 11 (1.0.3) with a GTX 690:
Performance preset (graphics): 20064
Extreme preset: 6449

Should note, it beat the AMD physics scores as well: 8020 & 8017 for the two runs (consistent).
 
[citation][nom]bigpinkdragon286[/nom]So you have a control group of 1 and a test group of 1, and your control group effects it's own test results? That hardly sounds scientific. With that much logic and reason, why not consider the 8350 the control and i7 the test? The article could easily have been titled, "CrossFire Versus SLI Scaling: Does Intel's Core i Actually Favor AMD's CrossFire?"For all of the back and forth about how the FX should have been compared to an i5, I actually don't see why that's relevant, but not for the reasons already proffered. If this is an article about SLI versus CrossFire performance on AMD's FX platform, why do we care what it does on Intel's platform? Mere curiosity, or is it a "feel good" thing?Why not just do an article about scaling on both platforms, instead of trying to shoehorn a conclusion that isn't very scientific onto AMD's FX platform? Saying the FX platform actually favors SLI is as though the FX branded equipment has some sort of choice in the matter. It makes more sense to say that SLI performs better on an FX platform than CrossFire, but that still doesn't answer the question of value. I may still get better performance per dollar on an FX platform using CrossFire, versus whatever other combination, so why not look at that aspect?I do not dispute the test results, I just find the article could have been constructed in a less biased and more useful fashion.[/citation]
Hmm. Let's see.

*Test of Logic for the Masses*

For example,

There's this game/benchmark that's GPU limited.

Single-GPU results on the i7 show us 7970 > 680 for a particular test. For the FX, we see the same result, with almost the same FPS for both cards.

But the moment we switch to dual GPU, CrossFire maintains the lead on the i7 (as is logical), yet 680 SLI outperforms CrossFire on the FX.

Comparing FX-SLI to i7-SLI, the result is proportional to the single-GPU difference, or nearly the same (i.e. there's only a tiny gap between SLI on the two CPUs). That is not so for CrossFire: the difference between the two processors is much larger, with the i7's results higher (as can be inferred from the context above).

What's the conclusion from the above example?


Off-topic P.S.: Filtered FCAT would likely have shown Intel/Nvidia in an even better light.
 
[citation][nom]sarinaide[/nom]Well some don't get it, they want a i5 3570 and in the end the exact same thing will be shown. That the FX favors GeForce to Radeon, go figure I don't mind fanboism preference if perfectly normal, hell I am probably the biggest AMD fanboi here but when a simple article gets turned into another but thats not a 3570 which its priced to compete against thread.So just for troll value "DOES FX PREFER GeFORCE TO RADEON" heck I got that from the title.[/citation]
Sure, sure. Nothing wrong with being a fanboi. You're an AMD fan, I'm an Intel fan. But that doesn't get in the way of seeing facts. If I had a reason to pan Intel, I would. I pan Nvidia when I have to, and in fact I usually don't pan AMD for anything, because I love the HSA stuff they're doing.

But it's painful to see the reactions of people on this site. I can't remember when I last read a comment thread that consisted mostly of logical discussion, interesting speculation, and constructive criticism or exchange of data. Actually wait, I think that was the Haswell preview thread...

You know, the first time Thomas did this article, he had only compared the processors, keeping the 7970s constant. At that time I argued against the article, saying that it doesn't really say anything about CrossFire, because there's no comparison to anything else. Obviously the i7 wins in CPU-bound tests, and they'd be about the same in GPU-bound tests. Nothing to prove, really.

Then someone said, could we please have SLI as well, and Thomas said yes, that's a good idea, I'll do that.

Now that he has, and the test is thorough, people are going on about how it's biased and where's the i5 and blah blah blah.

People ask, why not just compare FX-SLI vs FX-CF? Because, we say, how would you then know that, if SLI scales better than CrossFire, it's an SLI thing and not the CPU? So you check how the same setups behave on a non-AMD CPU.

But we're not really interested in how the games themselves are bottlenecked by the CPU; we're interested in the overhead of the graphics pipeline and how the CPU/platform deals with it in GPU-bound games, and for that there's no need for an i5, since it wouldn't change a thing.

If everyone remembers how i5s do vs FX CPUs in gaming benchmarks, you'll remember the trend is the same! FFS! How does it change anything?

Just to be clear, the above is for people who didn't get the article.

I hope this helped. If it didn't, god help you, not going to try and explain again.
 
[citation][nom]unknown9122[/nom]What is up with the old drivers. This test cant be legit until the drivers are up to date for both AMD and nVidia.[/citation]
They don't have the Radeons anymore. Read the article.

And that's also why Crysis 3 isn't included, because it hadn't been released when the older benchmarks were done.
 
An Ivy Bridge i7 @ 4.4 GHz has a 19% advantage over a 4.4 GHz FX (hardly the top clock you can pull from it) when both are running GTX 680s in SLI. 19% is just not that much, especially when you consider that the FX can clock to 4.8-4.9 GHz on good air cooling and is more than $100 cheaper to start with. Pretty solid showing for the FX, to be honest.
 
An i7-3770K, which is $150 more expensive than the FX-8350, performs 10% better with a single GPU and 20% better with dual GPUs. Not to mention the 990FX uses PCIe 2.0 while Z77 uses PCIe 3.0, and the Z77 board is $60 more expensive than the other one! Shouldn't it have done at least 50% better? The only time it really outshined the FX was in Skyrim. Also, they should do another one with the military-grade version, the Sabertooth 990FX Gen3/R2.0, which has PCIe 3.0; I'd say the difference would then be much less than it is now.
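A back-of-the-envelope value calculation, using the prices quoted above and an arbitrary baseline FPS (purely illustrative, not benchmark data):

[code]
# Hypothetical numbers: ~$200 FX-8350 vs ~$350 i7-3770K (the "$150 more"
# from the comment above), and a made-up 100 FPS dual-GPU baseline for the FX.
fx_price, i7_price = 200.0, 350.0
fx_fps = 100.0
i7_fps_dual = fx_fps * 1.20  # "20% better with dual GPUs"

print(f"FX : {fx_fps / fx_price:.3f} FPS per CPU dollar")
print(f"i7 : {i7_fps_dual / i7_price:.3f} FPS per CPU dollar")
# => about 0.500 vs 0.343: the i7 is faster outright, but the FX still
#    delivers more frames per dollar spent on the CPU.
[/code]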
 
Similar effects will be seen with slower processors from AMD or Intel. AMD's drivers use more CPU cycles, which is most noticeable when comparing something like Folding@home.

The recent factors against Nvidia that were not mentioned are the artificial limits on overclocking in its drivers after 296.10 and the reduction in CUDA processing abilities.
 
So the FX pulls almost identical results to an i7 that costs over $100 more... Makes me wonder what a $320 FX might do. Just food for thought; don't get your Intel-branded panties all in a twist.
 
I think they continue to use it because it is a popular RPG, and as such people wanting to do a new build or upgrade their rigs to play Skyrim would want all the knowledge they can get about it. Not the best reason, I know, but it is probably the main reason why it's included in almost all benchmark suites.
 
[citation][nom]cmi86[/nom]So the FX pulls almost identical results to the i7 that costs over $100 more,,, Makes me wonder what a $320 FX might do.. Judt food for thought don't get you intel brand panties all in a twist[/citation]

[citation][nom]cmi86[/nom]And stop friggin using skyrim ! we all know it favors intel because it was built to do so, so how are you going to keep using this title is benchmark comparisons ?[/citation]

This isn't a CPU test, and the CPUs were intentionally put into different graphs. It was a test to see whether CrossFire 7970s scale as well as SLI 680s with an FX-8350. The i7 is there as a control, to rule out the possibility that CrossFire simply doesn't scale well regardless of CPU. While I would like to see more data with more CPUs and more GPU combinations, the article shows evidence that CrossFire scales worse than SLI on the 8350.
 