GOM3RPLY3R
8350rocks :
GOM3RPLY3R :
juanrga, I have to say, you are really taking this to another level. I can really tell you are such a Fan-Boy.
1. The world record for clock speed doesn't matter. And yes, it can withstand higher temps BECAUSE of the build quality. You're right on that, but it's more because the build quality is focused on heat rather than performance. And yeah, we got a higher clock speed, but it's the code that matters, and I think you're disregarding that fact.
Oh, but it DOES matter, see...Intel uses a tri-gate-on-bulk process to try to get the most from the bulk wafer. Bulk is the cheapest wafer you can buy, it's the lowest quality bin, and Intel uses TriGate to try to squeeze the most out of it. Funny that they charge the most for the cheapest silicon, huh? AMD uses bulk for non-critical parts that don't require high performance...otherwise they use SOI, Silicon On Insulator, which means the silicon has an extra insulating layer that keeps it heat resistant, making it perform better...for longer.
2. About Aero: yeah, it's disabled in full screen, but what about windowed? I know not everyone runs in windowed mode, but for all the games I play, it usually increases the FPS by anywhere from about 10 up to 80. And yeah, that "won't matter" when I get my ass-kicking PC, but it's just natural for me to play that way; I get annoyed in full screen. So your fact about Windows Aero isn't fully accurate. And what dumbass would get Windows 8 for a gaming PC? It's terrible for it. With Intel being bad for most gaming titles, you are right; however, when Intel runs its integrated graphics alongside a discrete GPU, it tries to handle anything that isn't being handled by the main GPU, i.e. all the background processes. This goes back to me playing windowed. Sometimes I may run a game on one screen and want to check Facebook or a map or something on another. It really comes in handy then.
Intel does not currently support onboard graphics plus a discrete GPU; it causes a plethora of issues with the hardware, and they have outright stated that onboard graphics should be disabled if you're using a discrete GPU. You've received bad information somewhere.
3. Percentages, again, don't really matter. It's again about the code. I can run SuperPi and force it to go on all cores. Then, in a comparison where both are running at the same clock (4.0 GHz for example), an i7-3770k can still get about the same results as the 8-core FX chip, simply because of the way it's coded and its physical build.
The physical build is inferior. PERIOD. Anyone who knows ANYTHING about the composition of materials will not argue that Intel has the superior quality wafer...it's the most easily disproved claim you make. You keep talking about "how it's coded"...you do realize coding is not part of a CPU, right? It's programming language. If you're talking about protocols...then Intel is designed for single threaded applications; I have reviewed this multiple times in this thread alone. Also, SuperPi is a single core benchmark...it cannot be "forced" onto more cores...it is designed specifically to test single core performance. Bad information. The only reason the i7-3770k competes with the FX8350 in many categories is, frankly, because it is that good at single threaded apps, and this allows it to overcompensate in highly threaded apps.
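To make the single threaded point concrete, here's a minimal sketch (Python 3; this is NOT SuperPi's actual code, and the burn() function is just made-up busy work): a serial task only ever occupies one core no matter how many are available, while the same total work split across worker processes does scale.

```python
# A serial, CPU-bound task runs on one core; extra cores only help
# when the work is actually split across processes.
import time
from multiprocessing import Pool

def burn(n):
    # arbitrary busy work standing in for one "chunk" of a benchmark
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    N = 10_000_000

    t = time.perf_counter()
    burn(N)                       # single-threaded: other cores sit idle
    print("1 worker :", round(time.perf_counter() - t, 2), "s")

    t = time.perf_counter()
    with Pool(4) as pool:         # same total work split across 4 processes
        pool.map(burn, [N // 4] * 4)
    print("4 workers:", round(time.perf_counter() - t, 2), "s")
```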
4. What apps run 8 cores? Almost NONE. The extra cores at this point in time do not matter whatsoever, unless you use them to run cool benchmarks and do cool 8000-step math equations. With Hyper-Threading, you made the point that having it activated can decrease performance, and stated that the 8 separate cores are better. I have to say: 1. You can just disable it. 2. Some applications actually work better with Intel HT on 4 cores than with AMD's 8 separate cores. And 3. Yeah, it'll be faster with apps that run all eight, but unless you're an "Extreme Computer Scientist," I don't see that advantage with my ArmA game.
Actually, games like Crysis 3 that "support" HT actually run better without it on...google it and look at the YouTube videos...the facts are there. HT is a way to rook people out of more money for what is basically software trying to do the work of a core in a background operation...which all the while robs the hardware of performance on the foreground operation. HyperThreading is an industry-wide inside joke...Intel has nearly admitted as much openly.
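For anyone who wants to check what HT is actually adding on their own box, a quick sketch (assumes Python 3 with the third-party psutil package installed; whether disabling HT helps a particular game still has to be benchmarked per title):

```python
# Reports how many of the OS-visible "CPUs" are real cores versus
# Hyper-Threading/SMT sibling threads. Requires: pip install psutil
import psutil

logical = psutil.cpu_count(logical=True)     # includes HT siblings
physical = psutil.cpu_count(logical=False)   # physical cores only

print("logical CPUs  :", logical)
print("physical cores:", physical)
if logical and physical and logical > physical:
    print("Hyper-Threading / SMT appears to be enabled.")
```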
5. Back to that physical build stuff. Yeah, it can run hotter without problems, but think about the process of the workload. Your AMD will probably run at 4.5 GHz, and mine will run the same. And yeah, you have more cores, but the problem with that is you have more of a heat problem. So AMD uses less expensive, more heat-resistant materials that won't come close to the performance you'd get with regular materials. And thus you need the "Superior 8 Core Power" just to get similar performance to the Intel processors.
TDP and core voltage have a direct correlation to heat. At no point does the number of cores come into play. PERIOD.
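As a rough illustration of that correlation (a back-of-the-envelope sketch only; the capacitance figure is made up, and real chips add leakage and other terms), CMOS dynamic power goes roughly as C·V²·f, so heat tracks core voltage and clock speed very directly:

```python
# Rough CMOS dynamic power rule of thumb: P ~ C * V^2 * f.
# The capacitance value is purely illustrative, not a real chip parameter.
C_EFF = 1.0e-9  # effective switched capacitance in farads (made up)

def dynamic_power(voltage_v, freq_hz, cap=C_EFF):
    return cap * voltage_v ** 2 * freq_hz

stock = dynamic_power(1.20, 4.0e9)   # hypothetical stock voltage/clock
oc    = dynamic_power(1.35, 4.5e9)   # hypothetical overclock
print(f"power (and heat) rises ~{oc / stock:.2f}x for that bump")  # ~1.42x
```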
Again, the BS about materials...look man...I posted a link to Wikipedia that explained the difference between SOI and bulk for you...and you still sit here and try to tell me, wrongly, that I am wrong and you are right. Show me one shred of evidence that says bulk wafers are better than SOI, or that Intel uses anything other than bulk wafers. You can't find it...you know why? Because it doesn't exist.
6. On your usage question, I can say this: AMD's CPUs and GPUs are more suited to each other, so the workflow is more coordinated and can be processed in a "better" manner. And it would be more fair in a testing environment. It would be most fair to run an AMD CPU with an AMD GPU, an AMD GPU with Intel, Intel with Nvidia, and an AMD processor with Nvidia. And yeah, they're more optimized, but they're not made for each other (the way AMD and AMD are), and that's where the AMD-and-AMD vs. Intel-and-Nvidia combo comes into play.
Putting this together with what ericjohn said, what do you AMD fans have to say?
Just because Intel is more optimized for single threaded apps does not mean it will be better at anything multi-threaded now or in the near future. You need to come up with some facts to support your statements.
ericjohn004 :
The reason low power is so important is heat, noise, and obviously electricity use. You can build an HTPC with a 35 W Intel in it and it'd be completely silent without needing a fan. It'd be a lot harder to do the same thing with an AMD, as I've read in Tom's Hardware's 0 dB PC build. It's possible, but it's just a lot easier to do with an Intel, and cooler too.
I agree that with a 3570k or 8350, power consumption doesn't matter TOO much, because if you get either one of those processors you're going to overclock and give that advantage away, and the difference amounts to a cup of coffee once a month. However, power consumption does give us an idea of a chip's efficiency, the amount of performance you get per watt. And this is where the big advantage lies. It says a lot about an Intel chip that Intel can do so much more with so much less power. How fast would an Intel chip be if it could use well over 100 W? Very fast. But that's not what they're trying to do; it is what AMD is trying to do though, and they still can't get up to par with Intel even with power consumption out of the picture.
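Performance per watt is just a benchmark score divided by measured power draw; here's a trivial sketch of the arithmetic (every number below is a placeholder, not a real measurement; plug in your own benchmark score and wall-meter reading for each chip):

```python
# Efficiency = benchmark score / power drawn while running it.
# Scores and wattages are placeholders for illustration only.
def perf_per_watt(score, watts):
    return score / watts

chip_a = perf_per_watt(score=700, watts=80)    # hypothetical chip A
chip_b = perf_per_watt(score=690, watts=125)   # hypothetical chip B
print(f"A: {chip_a:.1f} points/W   B: {chip_b:.1f} points/W")
```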
I agree about the HD 4000 graphics; they hardly benefit at all, if any. They're a waste, and Intel should offer the 3570k and 3770k without the graphics for $50 cheaper. If they did, they'd never sell another 3570k with an iGPU again, but they would sell more of them overall. They do make a 3350P with the iGPU disabled, although I still think the iGPU is on the chip, just disabled.
If anyone still thinks that an 8350 is completely equal to an Intel 3570k or 3770k in gaming, just take a look at Tom's new article that came out today, comparing Nvidia and AMD cards in dual-GPU setups using an 8350 and a 3770k. Clearly both Nvidia and AMD cards run better on an Intel, especially AMD's cards, which is surprising. But Nvidia cards run better than AMD cards on an 8350 because they require less power from the CPU. Very interesting indeed. You can clearly see the quality difference between the two chips. Is it worth the extra 20 bucks for the 3570k? IMO, yes it is. I know the article used a 3770k, but there is really no difference between them in the games they tested.
Read the article. If you want 10% more performance in pretty much every game tested, then you'll want the best. If you couldn't care less about 10 FPS out of 100 FPS, then you'll be rewarded with a better price.
And SuperPi is a very important benchmark, as it demonstrates single threaded capability, which is a very large slice of the pie. This benchmark is important because it is indicative of the performance you can expect to get out of programs like LAME and iTunes, single threaded games, and so many other pieces of software; much of Windows 7 also uses a single core. To say single threaded programs are being phased out is true, but to say a single core's performance isn't important at all is completely untrue. It's still very important and will remain so for years; even as it is phased out, it will always matter to some extent.
***And that last footnote you made: since AMD makes both, Intel and Nvidia rely on each other. It's no different than the AMD CPU and GPU relying on each other. You basically just answered your own question.***
Already addressed this earlier.
8350rocks :
GOM3RPLY3R :
SamGriffiths :
https://www.youtube.com/watch?v=rIVGwj1_Qno
https://www.youtube.com/watch?v=4et7kDGSRfc
https://www.youtube.com/watch?v=eu8Sekdb-IE
I'm just going to say that it's sad that almost all of the results were done with AMD GPUs. That's not very fair. None of these links show consistent results with all the same GPUs. However, I did find that from all of the videos combined (as a reference average), Intel did win overall.
Considering Tom's Hardware just published an article showing that AMD GPUs perform better with Intel CPUs and Nvidia GPUs perform better with AMD CPUs...it doesn't surprise me that Intel won slightly. If they had used Nvidia GPUs, AMD would have won outright without any doubt...LOL.
But when they did the 8350 vs the 3570K on the GTX 670, Intel still won... even a little better than the 8350 vs 3770k on the 7970...