AMD CPU speculation... and expert conjecture

Page 238
Status
Not open for further replies.


AMD is not limited by power consumption; they don't give a hornet's @$$ about power consumption on the desktop. You do realize the 4770K/2600K/3770K will take many years to pay for itself versus an 8350, or even versus a secondhand, overclocked X58 system in 2011 (and the 960s and even 970s were at 2600K prices from reputable eBay dealers, not to say the 2600K was no improvement at all), and by then it will be obsolete... It was more like going from a 660 Ti to a GTX 760, TBH.
 

8350rocks

Distinguished


I am talking about core utilization here...not FPS numbers.

Apples, meet Oranges...we're all going to be friends.

Listen, the discussion was not about frame rates, it was about bottlenecks. You seem to gloss over that part and go straight to benchmarks, which are really only indicative of the content of the benchmark, not a player experience.

Further...you glossed over this:

[quoted benchmarks]


As I said before...the i3 is a bottleneck.

It is not disputable...and it is being propped up entirely by the GPU used.

Want to see a fair comparison? Let's look at 1024x768 low with an HD 7770 and see how many laps the 8350 runs around the i3... care to engage in that test? Also, you chose 1024x768 benchmarks because they illustrate a tangentially relevant scenario that no one playing Crysis 3 would observe in reality. The 1080p benchmarks show just how graphically demanding the game is, which illustrates my main point: the CPU is not the only part Crysis 3 is demanding on.

[benchmark charts]


Those benchmarks show how ridiculously demanding the game is...
 

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860


have anything other than PEW, PEW, PEW to brag about?

I found an energy efficient weapon for you to use next time.

[attached image: pew-pew-pew-02.jpg]
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


Microsoft is saying they want the XBone to last 10 years. I sure as hell hope not.
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


Propaganda? I'm talking about history and what led to AMD's current situation in the server space, and how that's shaped what they're doing now.

Cray was a specific example, not a generalization. History repeats itself when it's forgotten and people make the same mistakes again. If you want to ignore it, that's fine. There's no need for further discussion.


 

8350rocks

Distinguished


You are so far off base...without a GTX 680, that i3-3220 wouldn't be able to muster playable frame rates.

Also, when we're talking about frame rates, a 30% boost is astronomical. Anything over 10-15% is HUGE when you're talking about a single component improving performance.

Intel hasn't produced a 5.0 GHz CPU because they would fry the chip trying to get there and couldn't get consistent yields at clock speeds that high. Hence their clock speeds remain so low. If they advertised a CPU at 4.2 GHz and some of them had to run at 1.4 vcore to get there (like some hasfail samples)... then their performance per watt would be in the toilet and you'd have nothing to brag about.

Get off your Intel soapbox...

EDIT: 2600k = 89.25% Core resources used; FX 6300 = 86% Core resources used
 
 

8350rocks

Distinguished


No, it's 89.25% as I pointed out earlier. If you take away the HTT then it would still be 89.25% core usage...as HTT uses up core resources.

Let me put it differently: Core load % = CPU load % broken down by core

Understand?
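That relationship can be sketched with hypothetical numbers (the per-core figures below are purely illustrative, chosen only so their mean matches the 89.25% quoted above; this is not measured data):

```python
# "Core load %" is just the overall CPU load broken down per core,
# so the overall figure is the mean of the per-core samples.
per_core = [92.0, 95.0, 88.0, 82.0]  # hypothetical per-core utilization %

overall = sum(per_core) / len(per_core)
print(f"overall CPU load: {overall:.2f}%")  # prints 89.25% with these numbers
for core, load in enumerate(per_core):
    print(f"core {core}: {load:.1f}%")
```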

From this point forward, if you're going to try to spout numbers, then you need to show information supporting your facts.

You have had no data for this entire page. The conversation is done until you can provide any credible shred of proof that anything you have said at all is true about core usage on Crysis 3. You're just guessing.

Also, a lesser GPU paired with the 8350 would blow the doors off any other CPU in that comparison besides the 3960X. Do you know why? (No, of course you don't, because you're arguing something you cannot possibly win.) The answer: the 8350 has 25% headroom in core usage, and the only other CPU in the test with that much room to spare is the 3960X.

This means, directly and indisputably:

8350 > 2600k
8350 > 3220

EDIT: Interestingly enough, the 4 core FX 4300 used in the test is about 94-95% Core usage across the board, which is about 5% off what the Intel quad core is.

FYI: The FX line is already on par with SB in many applications, and better in many others. So you trying to tell someone the 2600 would destroy the 8350 is like telling them the sky is falling: they look at you and laugh, then realize what a fool you are for believing it.

You've essentially been comic relief for the people who actually know about hardware... except now we need relief from your comic relief, because you've become irritating and nonsensical to the point that it borders on insanity.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790



In the message posted just above yours, I already explained that increasing the number of cores increases performance if the cores are utilized by the software.

I also mentioned bottlenecks. There are many "ifs". For instance, if the task needs 8GB and the quad has only 2GB of slow memory, whereas the dual chip has 12GB of fast memory, chances are the dual will be faster. We can play at adding as many "ifs" as you want, but the main point is that performance is gained, in general, by increasing IPC, frequency, and the number of cores. Every chip designer uses all three.
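The IPC/frequency/cores tradeoff can be sketched as a back-of-envelope throughput model (all numbers here are hypothetical and the function is a rough Amdahl's-law estimate, not a real benchmark):

```python
# Rough model: one core delivers ipc * ghz billion instructions per second;
# extra cores only help on the parallelizable fraction of the work.
def throughput(ipc, ghz, cores, parallel_fraction):
    serial_rate = ipc * ghz  # single-core rate, G-instructions/s
    speedup = 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)
    return serial_rate * speedup

# Hypothetical chips: a quad with lower IPC vs a dual with higher IPC.
quad = throughput(ipc=2.0, ghz=3.5, cores=4, parallel_fraction=0.9)
dual = throughput(ipc=2.5, ghz=3.3, cores=2, parallel_fraction=0.9)
print(f"quad: {quad:.1f}  dual: {dual:.1f}")  # quad wins when work parallelizes
```

With a low parallel fraction the ranking flips, which is exactly the "many ifs" point: none of the three levers wins in isolation.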

I don't believe that software has reached any limit; on the contrary, I see plenty of room for improvement. Current programs are terribly unoptimized for existing hardware. The other day a poster explained to me how he was obtaining a 10--100% increase in the performance of programs by recompiling them for his FX chip. That is up to 2x on the same hardware, and only with automatic compiler optimization.

Your remarks about the GPU don't change anything I am saying here, because I referred to cores in the abstract sense, without specifying what kind of cores. What AMD is doing is substituting cores of one kind for cores of another kind, but it is still the "moar cores" paradigm. Look at the server space: 8 PD cores are being replaced by 4 SR cores plus a number of GPU cores, because the GPU cores are better suited to some workloads.

I don't know what you mean by "console hype", but my claims are backed by technical details of the new hardware in the consoles. It is not correct to say that the next consoles are as powerful as "mid-range gaming PCs". Many experts have noted how far the next consoles are from current PCs; Lottes has a particularly detailed analysis.

Also, I don't know why you think that a better implementation for bdver2 would affect the performance of Intel chips.






The "troll post posting nonsensical information" is, again, only in your head. Data, which has been posted here before, shows that an FX six-core is not fully loaded by the engine, whereas an i3 dual-core is loaded up to 99.5% (a bottleneck). Your "Intel runs circles" motto is more of your typical trolling. The same data shows the FX-8350 is faster than the i7, while the FX-6300 is just behind it, achieving 93% of the i7's performance.
 


Take a look at the difference between the i3 and the i5. Hell, take a look at the Pentium and the i5.

Also, to really test how well threaded the game is for CPUs, you need to pinpoint the CPU-dependent parts of the engine and test those: lower the resolution to the minimum and disable all GPU-dependent eye candy. That's the only way you will really know. The second graph, the one at 1024x768, is really weird to me. It looks like when they lowered the resolution, they also lowered the CPU bottleneck somehow. That would make sense, since they lowered the amount of data being thrown at the CPU to be crunched.

Anyway, I side with palladin on the threading issue. If you have never approached real-world, genuinely difficult problems from a threaded perspective, then you won't really understand what palladin is talking about. And just for kicks: threading is not just about spawning new threads that do stuff, but about re-imagining the software/hardware usage as a whole. Different systems have different bottlenecks to account for in big enterprise solutions, so it's a real head-scratcher every time you need a time-critical component. As a side note, backbones and ETL servers are the most fun to work with. ETL solutions, especially, can scale almost indefinitely, so even with lazy thinking you can thread efficiently.

Now, for desktop software, the threading/parallelism question is very different. First of all, in the enterprise world the hardware is (most of the time) at the mercy of the solution you choose, but on the desktop you don't have that luxury. You need to design your solution for the lowest common denominator and scale up from there. It's about seeing the glass half empty or half full. I really don't know whether Intel has hit some sort of IPC ceiling, but I doubt it, and they will still be releasing CPUs with more cores. AMD has been ahead in that game for a while now (since the Athlon 64 X2 days), with Intel following behind. The problem for AMD is that Intel drives the desktop market, not them. The enterprise world is very different; there are a lot of heavy hitters with years and years of threading experience. IBM lives on its own island next to NEC and VIA, and Oracle/Sun have their own private island as well. All of them love throwing cores at their problems, because all of their solutions can make use of extra cores (and they actually charge you for it... bastards). So, on the desktop, we'll start seeing a shift to 4+ cores when Intel decides it's time, not because the hardware for it already exists.
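The ETL point above can be sketched with the standard library. The transform function and data here are hypothetical stand-ins; note that real CPU-bound work in Python would use processes rather than threads because of the GIL:

```python
from concurrent.futures import ThreadPoolExecutor

def transform(chunk):
    # Stand-in for an independent per-chunk "transform" stage: no shared
    # state between chunks, so adding workers scales it without a redesign.
    return [x * 2 for x in chunk]

chunks = [list(range(i, i + 4)) for i in range(0, 16, 4)]  # toy "extract" stage

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(transform, chunks))  # parallel "transform" stage

loaded = [x for chunk in results for x in chunk]  # toy "load" stage
print(loaded)  # each input doubled, original order preserved
```

`pool.map` preserves input order, which is why the "load" stage can simply concatenate; a serialized stage (say, one shared output file behind a lock) would cap the speedup regardless of worker count, which is the bottleneck point made above.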

Does Intel or AMD sell single core chips for desktop?

Cheers!
 

griptwister

Distinguished
Oct 7, 2012
1,437
0
19,460


There is more BS in that statement than the XGames LA best whip rider line up. LOL! (if you watch extreme sporting events, you'll get it)
 

8350rocks

Distinguished


:rofl:
 

8350rocks

Distinguished


+1 Well said man...I couldn't have put it better.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


This kind of trolling nonsense was corrected before.



"The Root of All Evil" is not as well multithreaded as "Welcome to the Jungle". Look at the six-core performing exactly like the eight-core. Why? Because that section of the game ignores the extra cores...



People know very well that an i3 will beat an FX-8350 in games that ignore 50--75% of the FX chip. Intel trolls still believe the merit belongs to the hardware alone, LOL.

Trolls don't know this, but triple-A game developers selected the FX-8350 as the best CPU for future games. Of course, those future games will be using all the cores.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


I would add more: what kind of Haswell processors has Intel introduced to the market? Dual-core i3s? Or quad-core i5s and i7s?

Answer: 7i dna 5i seroc dauQ
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780
A 4GHz i3? Intel needs this? Do you think Intel would enjoy cannibalizing their entire market of $300 quad-core CPUs?

Intel abolished overclocking on lower-end chips like i3s to force people to buy K-series quads, and it worked. Not only did it work, but Intel managed to completely screw people over, and people have drunk so much Kool-Aid that they'll actively defend a company that's screwed them and continues to do so. The icing on the cake is that these people will usually defend Intel over things that are completely irrelevant to a desktop user's needs (GT3 performance, power consumption, IPC without accounting for clock rate, heat output, etc.). I wonder how long these lemmings are going to defend Intel. It is obvious that Intel's power-saving focus is hurting the overall clock-speed potential of their chips; it's completely evident in the overclocking headroom we've seen going from SB to Haswell. It's only going to get worse, and I wouldn't be surprised if Broadwell on desktop were cancelled because it offers a 3% IPC increase and 14nm can no longer hit clocks of 3.5GHz base and 3.9GHz turbo. That would make Broadwell on desktop an actual performance regression over the previous generation, all for the sake of lower power consumption. A goal best served by a RISC architecture as opposed to a CISC one.

If AMD did the same thing all these people buying $90 Phenom 2s would be buying FX 8350 X UNLOCKED ONLY CPU YOU CAN OVERCLOCK IN THEIR LINEUP EDITION. The internet would be mad as hell.

But Intel can do this and get a free pass, because they are Intel and their products are faster in the benchmarks review sites choose: benchmarks compiled with Intel's own tools, or optimized for Intel, or whatever.

I can't wait to watch all the lemmings like hajifur march off the cliff they're heading toward. It's rare to see people so vigorously defend something that is actively harming their interests; if you're into sociology and psychology, the current state of Intel fanboys is a very special treat.

Yes, an unlocked Intel dual core would be a great chip. And no, you'll never have one, even though you could have before (remember overclocking Core 2 Duos when Core 2 Quads existed?), because Intel doesn't care about you. They care about mobile. So much so, in fact, that they're willing to make architectural and fabrication changes that would prevent a new microarchitecture from being released on desktop because it wouldn't be worth it there.

I know you Intel failboys like hajifur are going to ignore this, but I hope the rest of you pick up on it and take the ammo I'm giving you. I'm tired of these mongrels swarming these threads, defending a company that's actively harming my interests (affordable x86 performance, meaning a combination of clock speed and IPC, is my top priority; power consumption and heat mean nothing) and yammering on incessantly about things I don't care about at all.

You guys who are obsessed with power consumption already have Intel chips that serve your purpose. Hating something else because it doesn't serve your purpose, and then saying everything that doesn't suit your purpose is crap, is idiotic. It's great that Intel is making low-power chips if you want low-power devices, but don't cheerlead for them and defend them constantly while disrespecting what others want out of their computers.
 


+1. Speaking of which, I still remember overclocking from the Pentium 4 and Athlon 64 days through the end of the real overclocking era (for Intel), with the i5/i7 750/760/8XX and the i7 920-950 overclocking to ludicrously high speeds for the architecture, before Intel killed it off for everything except the top-of-the-ladder "K" i5 in SB.
 

griptwister

Distinguished
Oct 7, 2012
1,437
0
19,460
Today, after scrolling through some YouTube videos, I came across people saying Haswell is a hot chip. Then I heard people say the FX 8350 is a hot chip. Then I turned on my Intel Celeron craptop for old times' sake. And you know what? It was HOT!

Conclusion: THEY'RE ALL HOT CHIPS! (Changing the world as we know it with one revolutionary break through)
 

cowboy44mag

Guest
Jan 24, 2013
315
0
10,810
I know that in Intel Land (a parallel dimension of Mario World) i3-3220s are 8 years ahead of the FX-8350. The problem is we all live in the real world. In the real world, the games that made the i3 look good are done. Going forward, games are going to be more and more like Crysis 3, and will take better advantage of more cores than Crysis 3 did. We tend to forget that Crysis 3 is a blueprint of sorts, a first-generation game that utilizes many cores. The next generations of such games will utilize the cores more effectively, making six- and eight-core CPUs more desirable. If not, Intel wouldn't be considering an 8-core Haswell (which, to the best of my knowledge, hasn't happened yet, so there is no reason to tout how great it is until it proves itself; lord knows the 4-core Haswell has had enough troubles, and an 8-core may not release for quite some time).

Everyone knows that Intel has an advantage over AMD right now, but come on, a 6-year lead? At least 10 different times hafijur has claimed that Intel is 6 years ahead of AMD in performance. I drink beer and read hafijur's posts and they make no sense; I switch to grain alcohol and, just as it seems to become clear... it's the next morning, the hangover is passing, and nothing he's posting makes sense anymore. Seriously, I go back and forth wondering whether you're really naive enough to believe what you're posting or whether you're paid by Intel to post this crap in AMD threads. I've seen a lot of posts from "different" Intel fanboys in different AMD threads, and they all have the same broken-record crap you're posting all over here. Either you read all their crap and are re-spreading it, or you go by many usernames to spread Intel's propaganda.
 

cowboy44mag

Guest
Jan 24, 2013
315
0
10,810


Haswell chips do run hot, especially when you try to overclock them. The Intel fan club likes to dismiss or overlook Haswell's horrible overclocking problems, but the chips use much more power and run far too hot when overclocked. The 4-core Haswell chips can't handle the vcore load of overclocking, so I have serious doubts about when there will be an 8-core Haswell chip, or what it will have to be clocked at to keep the heat under control.
 
Tom's is becoming like Anandtech and the other online sites, full of innuendo and butthurt. I find myself stupefied by comments, notably those on the FX-9590, and by just how little people know or even think before posting, just to get out the "Intel is more efficient" rhetoric that is about as old as time now.


I may need to get banned, because someone is going to tip this boat a little too much with utter rubbish that I don't have the patience to deal with.
 