AMD CPU speculation... and expert conjecture

Page 209 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.
Status
Not open for further replies.

8350rocks

Distinguished
i3s can't keep up with the FX 6300... they won't hold a candle to the 8350.

Comparison:
http://us.hardware.info/reviews/3314/23/amd-fx-8350--8320--6300-vishera-review-finally-good-enough-fx-8350-vs-i5-3550--fx-6300-vs-i3-3220

Crysis 3 FPS numbers...note the i3 with a GTX 690 can barely break 30 FPS...that isn't fine...

[Image: 0nIkCAb.jpg (Crysis 3 FPS benchmark chart)]


This shows core usage... the i3 is clearly a bottleneck... the FX 4300 is at full usage but just barely not a bottleneck, while the 8350 averages about 70-80% usage across all of its cores.



EDIT: If you consider that the 2600k is using Hyper-Threading and pulling resources from the real cores... average usage per core is 89% on the 2600k, which is about what you should expect to see across all 4 cores on a 3570k.

The 8350 averages 75.8% utilization across all 8 cores, meaning it has headroom. The only Intel chip in the data above with similar or greater headroom is the 3970x at 50.5% utilization, and while that is more headroom than the 8350 has, once you factor in Hyper-Threading the margin is smaller than you might think.
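The headroom arithmetic above can be written out as a quick sketch; the percentages are the ones quoted in this post, and the `headroom` helper is just for illustration:

```python
# Rough headroom check using the utilization figures quoted in this post.
def headroom(avg_utilization_pct):
    """Spare CPU capacity left, as a percentage."""
    return 100.0 - avg_utilization_pct

fx_8350 = 75.8    # FX-8350 average across all 8 cores (from the post)
i7_3970x = 50.5   # i7-3970x average utilization (from the post)

print(f"FX-8350 headroom:  {headroom(fx_8350):.1f}%")
print(f"i7-3970x headroom: {headroom(i7_3970x):.1f}%")
```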
 

Intel God

Honorable
Jun 25, 2013
1,333
0
11,460


Those graphs are so old. Like March old. The graph I linked was from a review done a few days ago.
 

8350rocks

Distinguished


The FX 4300 is 4 cores, and it's just barely not a bottleneck... 2 cores are not enough in Crysis 3... it wouldn't matter if they were clocked at 5.0 GHz, they cannot handle the workload, HTT or not. The 2600k is at 89% usage with HTT on a quad core. The FX 6300 is also at 80+% average utilization with 6 cores. Crysis 3 cares not for clock speed or IPC; it cares about being able to process the high number of threads running concurrently. Can you run it on a top-end i3? Probably... with a $1000 GPU you would likely see frame rates into the 40 FPS range (on a 3.6 GHz Hasfail i3)... but if you were going to buy a $1000 GPU to run Crysis 3, you wouldn't buy an i3 now, would you?
 

cowboy44mag

Guest
Jan 24, 2013
315
0
10,810


Crysis 3 is only used as the example because it is the first of a new generation of games. Crysis 3 still had to be playable on the PS3 and Xbox 360, so it was restrained a bit from what it could truly have been. Once studios no longer have to make games run on the old PS3 and 360, we will see games truly designed for multi-core systems. With 8 real cores and at least 16 threads to work with, the new games are not going to run very well on any i3 processor (2 cores and 4 threads).

Granted, truly multithreaded games (running 4-8 cores, 8-16 threads) will probably take another 2+ years to appear, since studios will have to learn how to code for the new consoles' strong points. I'm also sure they will want to produce PS3, Xbox 360, PS4, and XBone versions for at least a year after the new consoles hit the market to maximize their profits, which will further delay optimizing for the PS4 and XBone.

How well does a 2.9 GHz Haswell i3 do vs. an FX 8350 in Crysis 3? A 2.9 GHz Haswell i3 may be able to run Crysis 3 at lower settings, but it can't compete in the same league as the FX 8350, nor will a 3.6 GHz i3. Dual-core gaming's days are numbered, and anyone who thinks they are going to have a "future proofed" i3 gaming rig is just buying into the hype.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


If the game loads 6 or 8 cores to less than 50%, then it is not taking advantage of them. If the game loads the cores above 70%, then it is taking advantage, and AMD performs better. Consider Eurogamer's recent poll: the triple-A game developers polled recommend the 8350 as the best gaming CPU for future games.



The 8350 destroys the 2600k/3770k/4770k in multithreaded tasks. In fact, a "cheap" 8350 outperforms the ultra-expensive six-core 3960X in some multithreaded benchmarks. Yes, Intel will be releasing 8-core chips, but AMD has released Centurion (an 18% performance increase) now, with Steamroller (>30%) to follow.



Benchmarks show that the 4-core Jaguar (4 threads) is faster than an i3 (4 threads). The 8-core Jaguar (8 threads) in the next consoles will compete with i7 (8 threads) PCs. Notice that all the PCs used in PC-console comparisons/demos use an i7. That is not a coincidence.
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


A 5 GHz i3 wouldn't be that bad, actually. It calculates out to about 55 fps.
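Presumably that 55 fps comes from naive linear clock scaling. A sketch of the back-of-the-envelope math; the baseline fps and clock here are assumed values for illustration, not figures from the linked review:

```python
# Naive linear clock scaling, assuming the frame rate is fully CPU-bound.
# Baseline numbers below are illustrative assumptions, not measured data.
def scaled_fps(base_fps, base_clock_ghz, target_clock_ghz):
    """Estimate fps if the CPU clock changed and nothing else did."""
    return base_fps * target_clock_ghz / base_clock_ghz

# e.g. an i3 at 3.3 GHz doing ~36 fps, scaled up to a hypothetical 5 GHz part
print(round(scaled_fps(36, 3.3, 5.0)))  # 55
```

In practice the scaling would be sublinear, since memory and GPU limits don't move with the CPU clock.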

I think we all know dual cores are on the way out, though. Even the budget chips (Jaguar/Atom) have 4-8 cores.
 

Intel God

Honorable
Jun 25, 2013
1,333
0
11,460


I think you have the wrong idea of what gaming across 6-8 cores means. BF3 stresses my 4 cores to around 42%. If the game were developed to use all 8 cores, CPU usage would drop on the individual cores. You're never going to see that much usage in an upcoming enhanced game unless you're gaming at 1024x768.

At 1080p and above, the GPU is doing the majority of the work.
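The point about per-core usage dropping follows from simple arithmetic: if the total CPU work is fixed, spreading it across more cores lowers each core's load. A minimal sketch with illustrative numbers (real games don't spread work this evenly):

```python
# If total CPU work is fixed, per-core load falls as core count rises.
# Illustrative only: real games never parallelize perfectly.
def per_core_usage(total_work, cores):
    """Average per-core utilization if work spreads evenly across cores."""
    return total_work / cores

# BF3-style example: 4 cores at ~42% each = 168 "core-percent" of total work
total = 42 * 4
print(per_core_usage(total, 4))  # 42.0
print(per_core_usage(total, 8))  # 21.0
```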
 
If Intel had the guts to give us an i3 with 4 GHz+ clock speeds, they'd be out of business, haha. Old games don't use more than 2 cores heavily, so i3s are "good enough" thanks to HT and cache (AVX support as well, since Pentiums and Celerons don't have it, AFAIK).

Also, that's a very interesting point, juanrga. The current K-series i5s will be the lowest CPUs you'll want for games, thanks to their high clocks and 4 cores. I wonder how far you'll need to overclock an i5 to stay on par with the 4M/8T 8320s and 8350s down the road.

Cheers!
 

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860


Wow... that's all I can say...

[Image: power-peak.gif (peak power consumption chart)]


Haswell vs. SB... load power reduced from 112W to 102W... a whopping 10W, or roughly a 10% reduction...

Now apply that 10% power savings to SB-E... 189W - 10% = ~170W for a 6-core HW-E... pretty close to the 8350... add in 2 more cores and you're back over the 8350's power draw.

As for your A10... wow... 1 watt less than the A10-6700... that's so massive. But the idle power difference of 4 watts blows Hasbeen away, if 1 watt is considered "a lot less".

[Image: power-idle.gif (idle power consumption chart)]


Epic fail: pretending Hasbeen is the greatest thing around.

If you've been following our IDF Live Blog you've already seen this, but for everyone else - Intel gave us a hint at what Haswell will bring next year: 20x lower platform idle power compared to Sandy Bridge, and up to 2x the GPU performance of Ivy Bridge. http://www.anandtech.com/show/6262/intels-haswell-20x-lower-platform-idle-power-than-sandy-bridge-2x-gpu-performance-of-ivy-bridge

From TR's chart: a discrete GPU on SB idles at 45W, and Hasbeen with the IGP idles at 27W... where does this 20x come in? Maybe if you overclock SB and disable its idle power states. ROFL, even then SB would need to draw 540 watts (20 x 27W).
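The power arithmetic in this post can be sanity-checked in a few lines; the wattages are the ones quoted above:

```python
# Sanity-check of the power figures quoted in the post above.
sb_load, hw_load = 112, 102            # watts under load (SB vs. Haswell)
reduction = 1 - hw_load / sb_load      # fractional load-power reduction
print(f"SB -> Haswell load power drop: {reduction:.1%}")  # roughly 9-10%

sb_e_load = 189                        # SB-E load power, watts
# Apply the same ~10% saving to project a hypothetical 6-core HW-E part
print(f"Projected 6-core HW-E load: {sb_e_load * (1 - 0.10):.0f} W")

hw_idle = 27                           # Haswell platform idle, watts
print(f"20x the 27 W Haswell idle: {20 * hw_idle} W")  # 540 W
```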

 

Intel God

Honorable
Jun 25, 2013
1,333
0
11,460


Has anyone properly tested a locked i5 with a Z87 board? I want to see if BCLK overclocking is indeed locked.
 


First of all, did you read 8350's post? If you look carefully, the i7 2600k is pushing 80% on all its cores, and then there is additional data for the virtual cores, which use the 4 REAL cores' resources... so yes, 89% or so is not a far-fetched estimate. The 45% you are claiming is from the 3970x, and he wasn't even talking about that one.

The i3 can cope with a lot of games... for now. But do you really think dual-core processors will be able to keep up with games for much longer?
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


I will ask again: are you not tired of posting nonsense daily?



LOL. Look at the benchmark. They selected a section of the game which is poorly threaded. Look at the six-core i7s performing the same as the 3770k. The difference with the i5 is explained entirely by clock speed. Or look at an FX 4170 core being faster than an FX 8150 core.

They clearly benchmarked a section using 4 threads or fewer. Not a good example of the performance of next-generation games. Sorry.

For an example of AMD performance in multithreaded games, look at the gamegpu benchmark given above, or look at this:

[Image: Crysis 3 CPU benchmark, Very High settings, 720p]


You can see how the six-core i7 is much faster than the i5, and how the FX-8 is about twice as fast as the FX-4.
 

Intel God

Honorable
Jun 25, 2013
1,333
0
11,460


That was published back in February. Crysis 3 patches have made some changes since then.

[Image: PCGH Crysis 3 test, dual-core Haswell 4570T]
 

Intel God

Honorable
Jun 25, 2013
1,333
0
11,460


People say "Hasfail" because they're just mad they can't afford it :lol: ;)
 

GOM3RPLY3R

Honorable
Mar 16, 2013
658
0
11,010


Considering that the new GTX 700 series isn't on there, I can assume this is more than 2 months old. Try to find something that is up to date.
 

GOM3RPLY3R

Honorable
Mar 16, 2013
658
0
11,010


I feel like all you're doing is one of two things, or both:

1. You are pulling random information (that you may think is true) from the back of your brain.

2. You are using information that is irrelevant to the model under discussion to prove a point.

Stop doing what I used to do when I was a noob, and find some real information, like I showed you a couple of pages back.
 


DOES ANYBODY CARE ABOUT POWER FOR THE LAST FRIGGEN TIME!!? Haswell is not bad, but it is a lackluster desktop performance upgrade, and Iris Pro is basically a bit worse than the 7660D, with a jacked-up price tag, once you get to 1600x900 or above. Just stop posting nonsense about power consumption and performance per watt (use an icebox instead of a refrigerator if you are so worried), and understand that this thread is comparing Steamroller to Piledriver, with the occasional mention of Haswell. The Intel fanboy invasion seems to have started when GOM3R (no offense) was linked here by 8350rocks.
 


To be honest, NetBurst was not that bad; the mistake was sticking with it through Prescott and its eventual demise with the Pentium D. Northwood was actually pretty good, come to think of it. I would take a Northwood 2.8C all day over a Pentium M @ 1.8 GHz with a 479-to-478 adapter. Stop posting AMD-smearing content and exaggerating power consumption; it is getting annoying.
 

8350rocks

Distinguished


Just checked, and the top of the thread still says AMD??!? So why is he in here anyway?!

GOM3R may be somewhat annoying, but at least he is listening and absorbing the technical discussion about AMD architecture... he has gotten off the top of his Intel pedestal and is now standing on the 2nd step from the top... LOL. I can stand someone looking at valid information and actually asking relevant questions... I cannot stand someone constantly spouting nonsensical rubbish about power.
 

Mitrovah

Honorable
Feb 15, 2013
144
0
10,680
Does anyone have an educated idea of the range of watts the Kaveri APUs may use, considering the increased graphics power, which is all but a sure thing? Some have speculated between 100W and 65W, but if the graphics are more powerful than Richland/Trinity, how can it stay within that range? I'm a newbie, but it seems to me all the speculation would suggest the APU will be more power-hungry than the Richland A10.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


40% faster clock for clock:

http://gamrconnect.vgchartz.com/thread.php?id=159069



What I said is that an 8-core Jaguar in a console will compete with an i7 in a PC.

The 8-thread FX-8350 wins most multithreaded benchmarks against the 8-thread i7s and can outperform even the 12-thread i7-3960X in a few benchmarks.

Dual cores are outdated for gaming (unless you are only playing older titles), and quads will be next year.
 

8350rocks

Distinguished


Because of the process improvements and the node shrink, the expectation is that TDPs for the new Kaveri APUs will be similar to or less than current Richland parts... I expect the unlocked APUs to be 100W TDP and the locked APUs to be 65W or less. Power consumption is actually going to be lower with Kaveri.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Yes, Crytek released a patch that optimizes the game for the 3770k while decreasing the performance of the AMD chips (I wonder why), but the old and cheap 8350 is still faster than both the i5-3570k and the new Haswell i5-4670k.

Besides that, and the fact that the patch causes lots of problems for lots of users (Crytek's forums are full of angry users), the Eurogamer recommendation is unchanged: the FX-8350 is preferred over the i5 for future gaming.





Let us see why people say Hasfail, Hasbeen, Failwell...

Promised a 15% increase in performance? Fail. Benchmarks averaged 5%, with regressions in some tests. Funny how one of you gave a Crysis 3 benchmark where the Failwell i5 is slower than the Ivy Bridge i5.

Promised graphics performance at the 650M level? Fail. Those claims rested on cheated synthetics; in real games it performs poorly. Moreover, performance drops badly at higher resolutions, and the chip is power-hungry and expensive. Most OEMs are rejecting Haswell GT3e.

Desktop Haswell consumes more power than Ivy. Efficiency is poor. Fail.

Haswell overclocks worse than Ivy. Fail.

Haswell runs hotter than Ivy, and TDPs are higher as well. Fail.

Tom's Hardware doesn't recommend any Haswell chip in its updated gaming hierarchy. And a clearly pro-Intel site like Anandtech said in a recent review that Haswell desktops "are a joke".
 