
FX Vs. Core i7: Exploring CPU Bottlenecks And AMD CrossFire

Tested this game, then that game, and another game, and another game... and then there are those ads that insist on covering half the page every ten seconds or so...

Barely any difference between the games. What's the point, I wonder? Seems like pretty limited testing for an "Exploring CPU Bottlenecks"... test...

What about some other applications? A nice DAW that would utilize all cores, or Prime95...

I read this " it dropped both its multiplier and voltage level under an eight-thread Prime95 workload to stay within its rated power envelope. Throttling artificially curbs the CPU's power consumption, and the big increases we see when the Vishera-based processor is overclocked come from fixed multiplier and voltage settings."

I was probably too annoyed by the ads to see any chart of such tests... or are they simply not there??
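
For what it's worth, the throttling behavior that quoted passage describes follows straight from how dynamic CPU power scales: roughly with frequency times voltage squared, so dropping both the multiplier and the voltage cuts power disproportionately. A minimal sketch of that relation (the clocks and voltages below are illustrative assumptions, not measured FX-8350 values):

```python
# Why dropping multiplier and voltage together curbs power draw:
# dynamic CPU power scales roughly as P ~ C * V^2 * f. The clocks and
# voltages here are illustrative assumptions, not measured FX-8350 values.

def relative_power(v_new, f_new, v_base=1.30, f_base=4.0):
    """Power of a throttled state relative to the base state (P ~ V^2 * f)."""
    return (v_new / v_base) ** 2 * (f_new / f_base)

# Example: throttling from 4.0 GHz @ 1.30 V down to 3.4 GHz @ 1.15 V
print(f"throttled state draws {relative_power(1.15, 3.4):.0%} of base power")
# ~67% of base power from a 15% clock drop plus a modest voltage drop
```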

A price difference indeed, and with the motherboard it needs, I'm guessing the i7 can easily cost three times as much - a nice apples-and-pears comparison.
The main feature of the AMD chip (eight cores) is practically left in the dark...

Tom's, you've slid hard since I first read your articles ten years ago - well, apart from the advertising income, I guess :/
 
> But nobody games at 1920x1080 using an $800 combination of high-end cards.

*Sigh* Another idiot who doesn't understand the difference between 30 Hz, 60 Hz, 120 Hz, and LightBoost.
 
When the article makes ignorant statements like "nobody games at 1920x1080 using an $800 combination of high-end cards," it tells me the author doesn't know the difference between 30 Hz, 60 Hz, 120 Hz, and LightBoost monitors such as the excellent Asus VG248QE or VG278H. For completeness, the article should also have tested the GTX Titan so we could get a sense of how AMD and Nvidia perform on the same hardware.
 


Only problem with that is the Titan launched a month AFTER this article was written, so only the GTX 680 would have been in this review if SLI were being tested.

Also, explain how a monitor's refresh rate shows that the author doesn't know what he's talking about, when he was only talking about resolution, FPS, and the cost of the cards?

I understand that Hz is the number of times a monitor refreshes the screen, and that's the limit on how many frames you can see from any game (so if you're getting 80 FPS in a game but you're using a 60 Hz monitor, you'll only see 60 FPS), although I don't see the connection you're trying to make.
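
That cap is simple enough to write down in code. A toy sketch of the relationship (the function name and numbers are just made up for illustration):

```python
def visible_fps(rendered_fps: float, refresh_hz: float) -> float:
    """Frames the monitor can actually show per second.

    The panel refreshes refresh_hz times per second, so rendered frames
    beyond that are never displayed (with vsync they aren't even rendered;
    without it they show up as tearing, not as extra frames).
    """
    return min(rendered_fps, refresh_hz)

print(visible_fps(80, 60))   # 60  -> a 60 Hz panel caps you at 60
print(visible_fps(80, 120))  # 80  -> a 120 Hz panel shows all 80
```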

As for what the author said, it is true. When you're running an SLI/CrossFire config of high-end cards such as the GTX 680, HD 7970, etc., there's a 99% chance you're running a monitor with a higher resolution than 1920x1080, or you're using multiple screens.

A single high-end card such as the ones I listed above will run the vast majority of games just fine at 1920x1080, and this has been proven by far, far more people than just those here.

 
Just wait until the next-gen consoles hit the market; their chips are going to be eight-core AMD parts, and that's going to force developers to make games that fully support eight cores.
 
This is a great article showing two things. It shows that the $330 Intel i7-3770K is a superior chip to the AMD FX-8350 that I bought yesterday for $170 on sale at TD. It also shows that there is no real discernible difference between the two when gaming. Hell, I've been playing Skyrim for over a year with my twin HD 4870s and a Phenom II X4 965 with no performance issues whatsoever at max settings, so I really can't imagine a situation where the FX-8350 would hamper my gameplay.

Of course, this is an older article, so the pricing on the FX-8350 was probably higher than what I paid, but the cost of the i7-3770K is still $325, which is a full $155 more than I paid for the FX-8350. The fact that I already have a 990FX motherboard only made the decision that much easier. I think one could argue that for gaming, the i7 will be the longer-lived CPU, but at the same time, we could also see a performance resurgence from the FX lineup, as we did from the Phenom II lineup, as games use more and more cores; who knows? You can't really go wrong with either of these chips.
 
Not sure where that logic developed from, but your conclusions are utterly wrong.

What we can conclude is that the CPU is not currently the primary indicator of a computer's gaming potential. A cheaper AMD CPU will work just as well when paired with any graphics card as a much more expensive Intel chip would.
And from that we can see that you would always get a better gaming machine by paying $190 for an FX-8350 rather than $325 for a 3770K, because you would have $135 extra to spend on your graphics card [either Radeon or Nvidia], and that will give you real gaming performance benefits.
 
I know this is a bit of an old article, but in World of Tanks, at MAX settings, I get 120 FPS with my 660 Ti and i5-3570K @ 4.5 GHz. When I use an FX-8350 @ 4.5 GHz with the same 660 Ti, I can only run the game at medium settings, and I only get 60 FPS. Anything over medium causes stutter.

A LOT of games are still like this. If you play a lot of older games, it's important to have at least an i3. World of Tanks uses two cores. There are a lot more games out there than just the newest blockbusters, and AMD CPUs SUCK at running them.

I got my nephew an 8350 because I was looking for something cheap. I wish I had just gotten him an i5-3470 instead, because even overclocked, the 8350 can't hang in gaming at 1920x1080. Only in games like BF3 can you not tell the difference. But in many, MANY games you CAN.
 


I wouldn't say two cores... It uses one core fully and (supposedly) uses half of a second core. http://ftr-wot.blogspot.cz/2013/02/1622013.html?m=1

Although even on my Core i7 920 (stock clocks) with an AMD HD 7850, I haven't actually seen it use a second core/thread, or if it does, the usage is so low that I can't tell whether it's WoT using it or one of my other programs, such as NextPVR recording a TV show.
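
If anyone wants to check that on their own machine instead of eyeballing Task Manager, sampling per-core load while the game runs is straightforward. A rough sketch using the third-party psutil package (the 5-sample window and the 25% "busy" threshold are arbitrary choices of mine, not anything official):

```python
# Sample per-core CPU load while a game is running to see how many cores
# it actually keeps busy. Requires the third-party psutil package
# (pip install psutil). Note this measures system-wide load, so background
# programs (like a NextPVR recording) show up in the numbers too.
import psutil

samples = [psutil.cpu_percent(interval=1.0, percpu=True) for _ in range(5)]
avg = [sum(core) / len(samples) for core in zip(*samples)]

for i, load in enumerate(avg):
    print(f"core {i}: {load:5.1f}%")

busy = sum(load > 25 for load in avg)  # arbitrary "busy" threshold
print(f"{busy} core(s) above 25% average load")
```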
 


It seems Mantle has answered this question. There is around 40% CPU overhead under DirectX, as well as an issue with how DirectX handles multiple cores (on the rendering side there is also DirectX's small-batch problem). It's not efficient and rarely scales properly beyond three or four cores.

So yes, there is such a thing as processor-bound gaming, especially with multiple GPUs. It would logically affect AMD more than Intel, since the FX-8350's more parallel architecture is shunned by DirectX. A more serial architecture (fewer, stronger cores) benefits, and that is exactly what we're seeing in how Intel's chips handle the same workloads.
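
That "fewer, stronger cores win" observation is essentially Amdahl's law applied to a renderer whose draw-call submission stays on one thread. A back-of-the-envelope sketch (the 30% serial fraction is an assumed figure for illustration, not a measured DirectX number):

```python
def amdahl_speedup(cores: int, serial_fraction: float) -> float:
    """Amdahl's law: speedup when serial_fraction of the work
    (e.g. single-threaded draw-call submission) cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Assume 30% of frame time is stuck on one thread (illustrative only).
for n in (2, 4, 8):
    print(f"{n} cores: {amdahl_speedup(n, 0.30):.2f}x")
# 2 cores: 1.54x, 4 cores: 2.11x, 8 cores: 2.58x -> eight weak cores gain
# little over four, which is why fewer, stronger cores do better here.
```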
 