Intel Quadcore Vs. AMD Octacore - Gaming and future octacore-optimized development.

Page 7 - Tom's Hardware community forum thread.
Status
Not open for further replies.

prankstare · Distinguished · Joined Mar 29, 2010
Hey,

So we all know Intel's architecture is more energy- and performance-efficient per core, but how about multi-tasking performance? Also, do you think that, in the near future perhaps, not only games but also most computer programs will benefit from using 8 actual cores, the way next-gen consoles are doing for games?

The reason I'm asking is that I'm a bit torn between buying the "faster" but expensive Intel quad-core i5-3570K and the "slower" but much cheaper AMD eight-core FX-8350. However, if, say, 8-12 months from now software is eight-core optimized all the way (including games and overall multi-tasking), then I think the "slower" (for now) AMD solution is worth it.
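For what it's worth, whether a program benefits from eight cores depends on how it's written, not just on the CPU. A toy sketch (my own illustration, nothing from this thread) of the difference between serial code and code that can actually feed eight cores:

```python
# Toy illustration: software only benefits from eight cores if it is
# written to spread its work across them.
from multiprocessing import Pool

def busy_work(n):
    # Stand-in for one independent chunk of a parallelizable workload.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    chunks = [200_000] * 8  # eight independent chunks of work

    # Serial: one core does everything, the other seven sit idle.
    serial = [busy_work(n) for n in chunks]

    # Parallel: an 8-way worker pool lets an octa-core CPU run one
    # chunk per core at the same time.
    with Pool(processes=8) as pool:
        parallel = pool.map(busy_work, chunks)

    assert serial == parallel  # same answers, different core utilization
```

Games and most desktop software of that era were written closer to the "serial" shape, which is exactly why per-core speed mattered so much.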

So, any ideas?

Thanks!
 
So much for a good morning....

Ahh...well...I didn't bother to check the link...

I think this is a huge problem. If someone is going to go through the hassle of finding some info, at least look at it.

What you forgot to tell him is that the FX 8350 CPU itself doesn't support PCIe 3.0.

That doesn't matter... the L2 cache used to be on the chipset on the motherboard, but it's on-die now... there's no difference.

Actually, I did tell him that. And what does the L2 cache have to do with anything? The L2 cache has been on-die since like "forever" now. It used to be attached to the back-side bus, but I think even most P1s had it on-die. All P2s for sure. In any case, I have no idea why we're talking about L2 cache when the topic is PCIe 3.0 support.

Speaking of broken links, none of those French site links work for me.

Finally, I'll spend the next few minutes deleting posts. You guys need to keep it civil. Though I suspect some of you are purposely trying to get this thread closed. Remember, he who gets the thread closed also gets a vacation from posting for a while.
 


I remember you correcting 8350rocks on this.
 


The CPU doesn't have to support PCIe 3.0. What they did was put a bridge chip on the motherboard to fuse two PCIe 2.0 lanes into one PCIe 3.0 lane.

You can use PCIe 3.0 cards at full bandwidth with that motherboard.
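A quick back-of-the-envelope check on why two PCIe 2.0 lanes can stand in for one PCIe 3.0 lane (my own arithmetic, not from the thread; per-lane, per-direction figures, ignoring all overhead except line encoding):

```python
# PCIe 2.0 runs at 5 GT/s with 8b/10b encoding (80% efficient);
# PCIe 3.0 runs at 8 GT/s with 128b/130b encoding (~98.5% efficient).

def lane_bandwidth_mb_s(gt_per_s, payload_bits, line_bits):
    """Usable MB/s for one lane: raw transfer rate times encoding efficiency."""
    return gt_per_s * 1e9 * (payload_bits / line_bits) / 8 / 1e6

pcie2 = lane_bandwidth_mb_s(5.0, 8, 10)     # -> 500 MB/s per lane
pcie3 = lane_bandwidth_mb_s(8.0, 128, 130)  # -> ~985 MB/s per lane

print(f"PCIe 2.0 x1:   {pcie2:.0f} MB/s")
print(f"PCIe 3.0 x1:   {pcie3:.0f} MB/s")
print(f"Two 2.0 lanes: {2 * pcie2:.0f} MB/s")  # ~1000 MB/s, slightly above one 3.0 lane
```

So bandwidth-wise the bridge trick holds up: two 2.0 lanes (1000 MB/s) slightly exceed one 3.0 lane (~985 MB/s), at the cost of an extra chip and a little added latency.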

 


+1. PCIe 3.0 doesn't have to be supported by the chip. It makes no difference whether the bridge is integrated into the board or on the die; the effect is the same. It's not as if the CPU controls whether the motherboard allows the bandwidth.
 

Thx for the info.
 


I stand corrected.
 
There's a misconception about the FX 8350 and Phenom II X6 1100T beating the 2600K with ease.
http://www.tomshardware.com/charts/cpu-charts-2012/compare,3142.html?prod%5B5754%5D=on&prod%5B5759%5D=on&prod%5B5792%5D=on&prod%5B5877%5D=on&prod%5B5945%5D=on
In Tom's benchmarks the 2600K is still potent in every way. With that being said, I feel the FX 8350 is no better than a 2600K. I also included the FX 8300: it's just 100 MHz faster and loses 90% of the benchmarks. I used that chip to show that, running at the same or close clock speeds, the 2600K still rules, and we here in this forum know the 2600K can overclock well on air and water.
 


Like I said earlier, for what I use it for, the AMDs are better: video encoding, rendering, and compressing files. In those tasks the FX 8350 is better than the 3570K. It's nothing personal; Intel has strengths, but so does AMD. There is no misconception.
 

I'll give you this: out of the box, for what you use them for, the FX 8350 is a little better, but running them at the same speeds it takes a good beating.

Now, you also said your Phenom II X6 could beat the 2600K as well; that, my friend, is a complete falsehood.

The 3570K isn't better than the 2600K either:
http://www.tomshardware.com/charts/cpu-charts-2012/compare,3142.html?prod%5B5755%5D=on&prod%5B5759%5D=on&prod%5B5792%5D=on&prod%5B5877%5D=on&prod%5B5945%5D=on
 


So you run 3ds Max, PCMark 7, Blender, and AI chess all at once whilst extracting hundreds of GBs' worth of zipped files? What the hell kind of operation are you running?
 


I don't run them concurrently...but I am a game developer, so compressing rendered files for storage and rendering images are part of my daily routine. Also, I deal quite a bit with game architecture...so AI in games is something I deal with pretty frequently as well.

I don't consider my workload on my PC "typical"...most people don't use that kind of variety of programs.

EDIT: I have seen the new roadmap... I'm not sure comparing the 4th-gen E series to regular FX is apples to apples either, though if Steamroller held its own in that comparison I would be pleasantly surprised. Now, a 16C/16T Opteron 63XX... that would be a good match for the 4th-gen E series (since they're basically converted Xeons anyway).
 


All I'm trying to show you is that the 2600K is on par with the FX 8350 in everything you do; it's not laughable, and it can't be beaten by a mile. It will destroy your AMD Phenom II X6 1100T BE in everything you use your PC for, and at the same clock speeds the FX loses at everything!

Stop comparing the FX 8350 to the i7 3770K when you're on par with the 2600K.

I'm done; thanks for playing.
 
Well, actually, when the 2600K was released AMD didn't have the FX 8350, and I paid 319.00 for it. With that being said, I don't game on my machines, I make money with them, so over two years I've been enjoying the performance while making money. Oh, 319.00 for the best performance when it was released vs. 195.00 today for the performance of a two-year-old CPU.


 
The point is that people like yourself talk about how great this FX 8350 is while trying to dismiss the 2600K, which had that performance two years ago. So how great is something that arrives two years later with nothing extra?

 
The arguments you're giving are so poor. Why would you want to give money to a company that is purposely not trying?

Why does it matter how many cores they have if the performance is the same? With more cores in the new consoles, surely it would be stupid to buy Intel if they have equal performance now and the AMD chip might get more powerful once console-style optimization arrives (and it will be £100+ cheaper).

I still have not seen one proper counter-argument to the fact that if you get a 3570K now, you will end up spending more than £100 extra when you upgrade in the future.
 




Ah, you're pretty sharp. Always good to see some that aren't "sheep to the slaughter".
 