AMD Phenom II X6 1100T Review: The New Six-Core Flagship


killerclick

Distinguished
Jan 13, 2010
1,563
0
19,790
I have four Athlon II X3 machines that I'm using for some light network rendering and encoding. I'm going to upgrade them to Phenom II X6 and get twice the computing power (24 cores) for $800. All I have to do is buy the CPUs and pop them in, same motherboards, same DDR2, not even a BIOS update is needed. As for new systems, I'd probably use an i7 but I'm not building a new system now so AMD gets my $800.
 
[citation][nom]speakmymind[/nom]Intel is far from the giant from 2007, with massive bribing of retailers. They lost $1.25B to AMD for the "sell only INTEL, NO AMD, and get free INTEL processors as rebates". Today, Intel locks NVidia/VIA/SIS from the LGA1156 & LGA1366 platforms to protect their own monopoly. #$%^^&(^&@#!! Many motherboards companies supported INTEL for the past 10 years. Today, Intel is killing them one by one. (YES, ABIT is dead)[/citation]

VIA and SiS have not made chipsets for a long time. The last SiS or VIA chipset I remember was for the Pentium 4. When AMD bought ATI they got their own chipsets, and from then on it was only AMD or nVidia chipsets for AMD CPUs.

And to be honest, I won't miss nVidia chipsets. The only thing they had over Intel's or ATI's chipsets was SLI, and even then it almost wasn't worth the lower stability, worse overclocking, and lower performance. When it comes to chipsets, Intel makes the best.

Most of the mobo makers who are going under are going under because they don't have any OEM contracts. Most OEMs use Asus or Gigabyte low-end mobos. Abit probably didn't have that kind of contract, and without one you tend not to do very well. Besides, Abit wasn't one of the better brands; they were actually pretty low end. Asus and Gigabyte tend to make the best boards you can get.

As for this CPU, it's just a 1090T with a higher stock clock. Nothing majorly impressive. I don't think AMD will have anything too impressive until Bulldozer hits, and that's only IF their version of SMT works well enough to give it at least 90% of Sandy Bridge's performance.

Of course AMD would prefer to beat Sandy Bridge so they can finally price their CPUs like they did with the Athlon 64 and make a decent profit. But I guess we will see when it hits. We know SB should give at least 20% better performance overall than Nehalem (more in some areas, less in others). That means BD will have to improve by 20% just to beat Nehalem and by 30% to match SB.

We shall see.
 
[citation][nom]KT_WASP[/nom]This article is questionable, and I'll tell you why. Look at the game benchmarks, 1280x1024 resolution? Why would the author of this article use such low settings? I'll tell you why, because at that setting, the included Intel CPUs look better. Every other review site used higher resolutions and the results come out very different. Check for yourself and you will see. For example, look over on guru3d.. it shows the i7-980X vs the PIIx6 1100T in FarCry 2. They picked that game because you would see the differences of CPU more as modern GPUs can handle that game with no problems. At 1280x1024 the i7-980 decimates the AMD counterpart. But, get past 1600x1200 and all of the sudden the 1100T is neck and neck.. get up to 1920x1080 and the PIIx6 surpasses the i7-980X. Go look at all the sites.. really do... and you'll see, that once the resolutions go up, the field levels out dramatically. Who games at 1280x1024? You? I didn't think so... Misleading game benches just to make the i7-920 look better is pretty bunk IMO.[/citation]

In order to properly see whether a CPU will bottleneck your gaming rig, you have to run at a low resolution. At lower resolutions, performance tends to be limited by the CPU rather than the GPU. The higher the FPS a CPU can sustain at a low resolution, the less chance that CPU will bottleneck the GPU.

When the game shows worse FPS at a higher resolution, it normally means the GPU has become the bottleneck rather than the CPU.

Reviewers did this back when the Athlon 64 was the better CPU (probably at 800x600 or lower), and they will continue doing it.

In terms of multi-GPU setups, it has been shown that the LGA1366 Core i7 series has less chance of bottlenecking a game when running really high resolutions, probably due to the massive bandwidth from its QPI links.
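
To make the reasoning above concrete, here is a minimal toy model in Python. All the numbers are hypothetical, chosen only to illustrate the point, not taken from this review: frame time is set by whichever of the CPU or GPU takes longer per frame, and only the GPU's share grows with resolution.

[code]
# Toy model: each frame costs some CPU time and some GPU time, and the frame
# rate is limited by whichever one is slower. CPU cost is roughly resolution-
# independent; GPU cost grows with the number of pixels. All numbers below
# are hypothetical and only illustrate the reasoning, not real benchmarks.

def fps(cpu_ms, gpu_ms_per_mpixel, width, height):
    gpu_ms = gpu_ms_per_mpixel * (width * height / 1e6)  # GPU cost scales with resolution
    frame_ms = max(cpu_ms, gpu_ms)                        # the slower component sets the pace
    return 1000.0 / frame_ms

fast_cpu_ms, slow_cpu_ms = 5.0, 8.0   # per-frame CPU cost of two hypothetical CPUs
for w, h in [(1280, 1024), (1920, 1080), (2560, 1600)]:
    print(f"{w}x{h}: fast CPU {fps(fast_cpu_ms, 3.0, w, h):5.1f} fps, "
          f"slow CPU {fps(slow_cpu_ms, 3.0, w, h):5.1f} fps")
[/code]

At 1280x1024 the two hypothetical CPUs come out far apart; by 2560x1600 the GPU term dominates and they print the same FPS, which is exactly why low-resolution runs are used to isolate the CPU.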
 

f-14

Distinguished
Would we recommend an upgrade today, though, knowing that Sandy Bridge is a couple of weeks away, and the first Brazos-based CPUs are going to be unveiled at CES? If you can, it certainly seems like a better idea to wait.

Yes, wait for Sandy Bridge to come along and see whether it drops the prices on the i7s and i5s to something respectable like AMD's prices, or whether they stay at the same ludicrous prices. Sandy Bridge is starting to sound like more hype than performance benefit. AMD's architecture needs an improvement; they are probably just waiting for Intel's Sandy Bridge to do it so they don't have to take a price hit, but can deliver Intel one instead. AMD is far overdue for some kind of major improvement, and I would not be surprised if it happens between February and April. It would explain this cat-and-mouse game they are playing with Intel on prices and refinements/speed bumps.

I'm hoping it is a die shrink to 32 or maybe 20 nm?! Less power consumption would really help them with commercial sales. I also would not be surprised if AMD puts two GPUs on the next hex-core and/or quad-core chips, with a way to possibly eliminate the need for PCI Express GPUs until the on-die GPUs are badly outdated, or eliminate it altogether, since a new CPU/GPU chip would be an easier upgrade path and possibly cheaper compared to Intel's prices: you'd be able to upgrade to a new chip two to four years later, all for less than the price of Intel's chip at the time of the original purchase. Yes, it would make things worse in some regards, but it would also make things simpler and hopefully bring a huge speed increase, which is much needed if we want our new games to be as sweet as Crysis/Crysis 2 with DirectX 11. AMD still needs to start profiting from that horrible ATI deal, or else it means they gave up being a CPU company to become a GPU company, IMO. It would also give more time for software developers to get with the program and adopt and incorporate multi-threading, since putting out more cores isn't doing much if hardly anybody is taking advantage of them, grrr. Whether AMD knows or likes it or not, there is going to be huge pressure for them to improve this spring if they want to keep sales.
 

f-14

Distinguished
[citation][nom]masterasia[/nom]Sorry AMD fanboys...not good enough. Took AMD 2 years to be almost where Intel was 2 years ago with the i7 920 (which is still top dawg and the best bang for your buck).[/citation]
News flash: the i7-950 has been the best bang for your buck since the last System Builder Marathon, when prices hit $299 at both Best Buy and Microcenter.
 

Travis Beane

Distinguished
Aug 6, 2010
470
0
18,780
[citation][nom]Mark Heath[/nom]I wish Intel would do something like this for all (or at least most) of their processors.(the speed bumps with same price model)[/citation]
i7-920 for $300, then i7-930 for $300, now i7-950 for $300. Isn't it the same thing?
These are nice chips, and pretty cheap too. I wouldn't trade my i7-920 for one, though. 3.675 GHz (175 MHz base clock × 21 multiplier) at stock voltage is just too nice. The chip is a year and a half old now, and there still is no suitable replacement for under $1,000.

Waiting to see the results of a head on collision between Bulldozer and Sandy Bridge though.
 

coldmast

Distinguished
May 8, 2007
664
0
18,980
[citation][nom]Mark Heath[/nom]I wish Intel would do something like this for all (or at least most) of their processors.(the speed bumps with same price model)[/citation]
i5 760?
 

saikyan

Distinguished
Feb 1, 2010
22
0
18,510
[citation][nom]f-14[/nom]news flash I7-950's are best bang for your buck since the last system builders marathon when prices hit $299 at both best buy and microcenter[/citation]

i7-950 is at Microcenter for $200 now and the i5-760 for $170.

I see absolutely no compelling reason to go AMD. Inferior performance for negligible savings.
 

luke904

Distinguished
Jun 15, 2009
142
0
18,690
[citation][nom]saint19[/nom]AMD is now delivery a lot of performance with their products, maybe not at the same Intel level, but that can be enough for some people that prefer price/performance and doesn't have enough money to go with Intel's CPUs[/citation]

The i5-750 has a lot better performance per dollar than AMD CPUs at the same price.
 

luke904

Distinguished
Jun 15, 2009
142
0
18,690
[citation][nom]saikyan[/nom]i7-950 is at Microcenter for $200 now and the i5-760 for $170. I see absolutely no compelling reason to go AMD. Inferior performance for negligable savings.[/citation]
Intel can't compete AT ALL for CPUs under $100.
 

bildo123

Distinguished
Feb 6, 2007
1,599
0
19,810
[citation][nom]stingstang[/nom]AMD is most certainly not doing great if they have to rerelease all their chips. Here's what happens: They make a batch of chips and sell them all as 4 core processors at X speed. The ones they don't sell or are returned go into stress testing. Those batches are divided in to x2 or x3 piles depending on how stable they are with which cores enabled. The winners of the tests get promoted and branded as new, faster chips with x+100 MHz. The process then repeats. Now if you'll look, their third iteration of this process still doesn't match intel's entry-level i7 processors. It's just embarrassing is what that is.[/citation]

How is it embarrassing? They aren't really making any massive or even moderate architecture changes that would enable the CPU to "catch up" to an i7. It's basically a higher-quality chip that just so happens to go a little faster than its direct predecessor. Also, from a price perspective, all the stars and planets align quite well, if anything in AMD's favor.
 
Sorry, this is a line item, not a full article; it just doesn't matter. Pretty much any CPU available today is "good enough" for most tasks, and if it isn't, you probably know why, and what you need. I'm not sure that will change even when SB/BD come out, but hopefully prices will drop a lot.
 

pinkfloydminnesota

Distinguished
Mar 4, 2010
181
0
18,680
It'd be nice to see what six cores can do with GTA IV/EFLC, as has been pointed out by others, and BC2 as well. It's not about everyone's favorite game; it's about using games that are known to perform significantly better as you add cores.

As far as Intel/AMD fanboyism is concerned, this could be settled easily by including performance per dollar, which is what matters to most of us. Given the vagaries of capitalism, I think we'd find that mostly it's a tie.
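
For what it's worth, the per-dollar metric the post above asks for is simple to compute; here is a minimal sketch, where the scores and prices are hypothetical placeholders rather than results from this review:

[code]
# Performance per dollar = benchmark score / street price.
# The entries below are hypothetical placeholders used only to show the math.

cpus = {
    "CPU A": {"score": 100.0, "price": 265.0},
    "CPU B": {"score": 112.0, "price": 300.0},
}

for name, c in cpus.items():
    print(f"{name}: {c['score'] / c['price']:.3f} points per dollar")
[/code]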
 

tom thumb

Distinguished
Apr 18, 2010
181
0
18,690
Looks like there is a fine line here between Intel and AMD. I would give AMD the upper hand in performance per dollar, DESPITE the fact that they used the 920 in these benchmarks and not the 950, which is the one to get these days.

It's a very fine line. I think for most people it will come down to whether they can give up triple-channel memory in exchange for value.

... but then, you gain value when you buy triple-channel RAM.

It's impossible to definitively say "this one is better than that one" at this point. The CPU market is very balanced.
 

jj463rd

Distinguished
Apr 9, 2008
1,510
0
19,860
On the gaming benchmarks, wouldn't it make sense to compare this six-core CPU on a game or simulation that can use the extra cores (like Microsoft Flight Simulator X)? Also, have the six-core monster i7-980X in there too, along with other representative quad-core CPUs from AMD and Intel.
 

Aoster87

Distinguished
Sep 18, 2008
211
0
18,680
I'd like to see higher resolutions thrown in for gaming, as well as an i7-950 since they are significantly cheaper now.
 

rohitbaran

Distinguished
I don't think Sandy Bridge will have a huge performance increase over the Nehalem architecture. Intel themselves have primarily publicized the improved graphics performance. So I don't think Sandy Bridge will pose a greater threat than the current Intel processors do. It is, however, only a matter of time before the actual benchmarks are revealed.
 

bradkman

Distinguished
Dec 9, 2010
28
0
18,540
Why are they using a motherboard with the 790FX chipset instead of the 890FX chipset? The Intel chips are using their newest chipsets, so why not AMD? We might see a slight performance increase by using the newer technology!
 

Chris_TC

Distinguished
Jan 29, 2010
101
0
18,680
Some suggestions:
1) Use a more complex scene in 3ds Max. 30 seconds for a frame is way too short to get a meaningful comparison. Besides, who on earth renders frames that only take 30 seconds ;-)
2) Crysis is a great gaming benchmark for GPUs, not so much for CPUs. My lowly dual-core can max it out easily. A decent game for multi-core CPUs is, for example, GTA IV. It eats cores alive.
 

K2N hater

Distinguished
Sep 15, 2009
617
0
18,980
[citation][nom]luke904[/nom]you idiot... it makes intel look better because when you lower the resolution, the graphics bottleneck is removed (mostly) and the higher performing cpu shows through. if you tested it at a higher resolution then the results would be closer together because the higher end cpus would be held back more. accept it. the i7 920 is a great cpu and is more powerful than alot of what AMD has. btw - i run a amd 955[/citation]
At least for now, DX11 games rely much more on the GPU than the CPU. So in the end, even budget Intel/AMD processors will perform nicely when paired with a high-end GPU.
 

rhinox

Distinguished
Dec 9, 2010
3
0
18,510
The 790 chipsets don't know how to implement all of the power states for the Thubans; you must use the 890s. Silly little Tom's. Check out Anand's review. It's a 100-watt difference.

Pabst must be embarrassed at what you have become.


 
Don, nice article as always.

When you get a chance, would you mind comparing the hexcore offerings from Intel and AMD across a wider range of graphics resolutions to answer KT_WASP and others who have raised this issue?

Bottlenecks aside, it seems like a good little article on its own, worthy of some attention.

:)
 

SteelCity1981

Distinguished
Sep 16, 2010
1,129
0
19,310
Def a plus for us PC users running AM2+/AM3 systems. This gives these platforms even more life. I don't plan on building a new PC for the next couple of years, so this is def a plus for me: I can upgrade to a Phenom II X6 soon on my AM2+ platform.
 