AMD Phenom II X6 1100T Review: The New Six-Core Flagship


cleeve

Illustrious
[citation][nom]rhinox[/nom]The 790 chipsets don't know how to implement all of the power states for the Thubans. You must use the 890s. Silly little Tom's. Check out Anand's review. It's a 100 watt difference. Pabst must be embarrassed at what you have become[/citation]

There are more 790 series chipsets in the wild than there are 890 series chipsets. It's a valid test bed, despite the fact that you'd prefer to see this particular result buried.

The only thing that embarrasses me is seeing a fellow member of the human race post sensationalist drivel like your comment on our forums. :D
 

cleeve

Illustrious



Thanks Reynod, I appreciate the kind words.

I will say that I think a lot of people have missed the point. This is a CPU comparison. I'm not looking to push the resolution high enough to make the graphics card the bottleneck; that's the OPPOSITE of the goal here. That's why we keep the resolution low: to make this a processor comparison and rule out graphics card limitations as much as possible.

Resolution gets raised when we're testing the graphics card, not the CPU.
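
To make that concrete, here is a rough sketch with made-up per-frame costs (purely illustrative, not figures from our testing): if frame time is set by whichever of the CPU or GPU is slower, lowering the resolution shrinks the GPU's share and lets the CPU difference show.

[code]
# Toy model of the CPU/GPU bottleneck (hypothetical per-frame costs, in milliseconds).
def fps(cpu_ms, gpu_ms):
    # Whichever component is slower sets the pace for the frame.
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_fast, cpu_slow = 8.0, 12.0          # two hypothetical CPUs
gpu_low_res, gpu_high_res = 6.0, 20.0   # GPU cost at a low vs. a high resolution

print(fps(cpu_fast, gpu_low_res), fps(cpu_slow, gpu_low_res))    # 125.0 vs ~83.3: the CPU gap is visible
print(fps(cpu_fast, gpu_high_res), fps(cpu_slow, gpu_high_res))  # 50.0 vs 50.0: the GPU hides the gap
[/code]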

Having said that, I do try to sneak a gaming article into the mix once in a while that concentrates on processor performance in different games (case in point: the sub-$150 CPU game-off:
http://www.tomshardware.com/reviews/gaming-processor-core-i3-athlon-ii,2666.html)

I'll get around to looking at this above the $150 mark if enough readers would like to see that, and by the sounds of it that's what people would like to see... :)

 

TW_Honorius

Distinguished
Dec 9, 2010
[citation][nom]KT_WASP[/nom]This article is questionable, and I'll tell you why. Look at the game benchmarks: 1280x1024 resolution? Why would the author of this article use such low settings? I'll tell you why: because at that setting, the included Intel CPUs look better. Every other review site used higher resolutions and the results come out very different. Check for yourself and you will see. For example, look over on Guru3D. It shows the i7-980X vs. the PII X6 1100T in Far Cry 2. They picked that game because you would see the differences in CPU more, as modern GPUs can handle that game with no problems. At 1280x1024 the i7-980X decimates the AMD counterpart. But get past 1600x1200 and all of a sudden the 1100T is neck and neck. Get up to 1920x1080 and the PII X6 surpasses the i7-980X. Go look at all the sites... really do... and you'll see that once the resolutions go up, the field levels out dramatically. Who games at 1280x1024? You? I didn't think so... Misleading game benches just to make the i7-920 look better is pretty bunk IMO.[/citation]


Sorry to burst your bubble, but you are wrong about gaming resolution. Check here: http://store.steampowered.com/hwsurvey/ under Primary Display Resolution; 1280x1024 is the third most popular screen resolution among Steam users. And I run that resolution because I use my 55-inch TV, sitting 15 feet away, as my computer screen. So don't make a statement without proof.
 

Guest

Guest
Well done, AMD, catching up a bit more now. I've been using AMDs since the K6-2 and K6-III days, when cache memory was king IMHO. Since then a Duo was the best value, but I might consider an AMD again next.
 

GaMEChld

Distinguished
Dec 29, 2009
[citation][nom]Cleeve[/nom]There are more 790 series chipsets in the wild than there are 890 series chipsets. It's a valid test bed, despite the fact that you'd prefer to see this particular result buried. The only thing that embarrasses me is seeing a fellow member of the human race post sensationalist drivel like your comment on our forums.[/citation]

I thought the proper protocol for benchmarking a component was to eliminate other components as potential bottlenecks before putting the device through its paces. As such, the 790 chipset does not become the preferred test bed just because there are more of them out there. If the 890 unlocked more of the chip's performance, it should have been used. It's just laziness.

That said, I doubt the 890 would've made much of a difference either way, and as someone else said, this should have been a line-item article, not a fully worked-up benchmark article. But if you are going to go through the effort of benchmarking a chip like this, you may as well use the current chipset for that chip.
 

kartu

Distinguished
Mar 3, 2009
Performance per buck would indeed be a nice addition (say, one number per area: video, games, apps).
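
Something along these lines would do (a rough sketch in Python; the scores and prices below are made up for illustration, not numbers from the review):

[code]
# Hypothetical benchmark scores and street prices, just to show the metric.
cpus = {
    "CPU A": {"price_usd": 265.0, "apps_score": 100.0},
    "CPU B": {"price_usd": 200.0, "apps_score": 85.0},
}

for name, data in cpus.items():
    points_per_dollar = data["apps_score"] / data["price_usd"]
    print(f"{name}: {points_per_dollar:.3f} points per dollar")
[/code]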

Even though the "720" is not really an "entry CPU", it's clear that AMD is still behind, which should be VERY BAD NEWS also for those who buy only Intel CPUs, no matter what. You will soon feel what I'm talking about when Intel kills overclocking for the biggest part of the market with Sandy Bridge. (And remember, it's not because they want to make more money; it's for your own good, so that they ensure you have a "stable" CPU.)
 

demonhorde665

Distinguished
Jul 13, 2008
[citation][nom]Chris_TC[/nom]Some suggestions: 1) Use a more complex scene in 3ds Max. 30 seconds for a frame is way too short to get meaningful comparisons. And plus, who on earth renders frames that only take 30 seconds ;-) 2) Crysis is a great gaming benchmark for GPUs, not so much for CPUs. My lowly dual core can max it out easily. A decent game for multi-core CPUs is, for example, GTA IV. It eats cores alive.[/citation]

I'm a 3ds Max user myself. While I can certainly see the validity of your point about renders, I still disagree, as you obviously never went to any formal game art school. I'm currently a student majoring in game art design; we quite often have to render stills of characters we do in class and whatnot, and none of these character renders takes over 30 seconds, even when dealing with high-poly characters (100k+). So their benchmark is still quite valid, just on a limited spectrum of 3ds Max users (but then again, consider that most amateur modders only use Max to model clothing mods and character mods, and you could easily say the majority of Max users actually render well under the point where a frame would take more than 30 seconds). Also keep in mind that almost anything made for modern games (excluding the rare DX11 games) would only use a diffuse, specular, and normal map at most, and again, none of these texture types really adds much to render times.

But I can see your point when dealing with Max users that work in the movie industry, for instance. Those scenes can quite often take days to render out 30 seconds of film, but then again, any serious film company will be running a farm with no fewer than 10 machines, so the CPU power in that regard is rather moot as long as all ten machines are running fairly recent CPUs.

And I agree Crysis was a dumb-f--- choice for a CPU test. I'm on an Athlon 64 X2 5000+ Black Edition OC'd to 3 GHz with 3 GB of RAM and a single Radeon 5770, and Crysis runs very nicely on max settings for me (on DX9), 30-40 fps to be precise. Not the best times, but certainly playable.
 

The_Trutherizer

Distinguished
Jul 21, 2008
I think it's amazing what AMD is doing with the old 45nm fab process. I can't wait to see what their performance is going to be when they finally move to a more compact process. When Intel transitioned, it felt like they doubled their performance. I firmly believe that if Intel were not on 32nm, AMD would have kicked their pants off by now. So when Fusion comes out, expect more than just a performance increase from the better architecture.

 

youssef 2010

Distinguished
Jan 1, 2009
[citation][nom]article[/nom]Fusion and Sandy Bridge might be around the corner, but AMD isn’t waiting for the next generation to deliver value. The Athlon II and Phenom II lines continue to offer very impressive performance for the price. Would we recommend an upgrade today, though, knowing that Sandy Bridge is a couple of weeks away, and the first Brazos-based CPUs are going to be unveiled at CES? If you can, it certainly seems like a better idea to wait. After all, the computing landscape could very well change in less than a month.[/citation]

Exactly what I was thinking. It is certainly nice for prices to remain the same as performance increases. But this should have happened sooner to cement their value motto, and that would've enhanced sales too. Then again, maybe this launch has more meaning than we're aware of.

BTW, nice article, but when you're comparing value, it's better to compare products that exist on the market (the i7-950 instead of the 920) to provide a realistic picture of the performance delta (however insignificant it might be).
 

cleeve

Illustrious


The 890 performs no better than a 780G. Testing has proved this, and AMD's own list of suggested boards for testing the 1100T includes a variety of chipsets. It makes no difference.

If it hadn't already been proven, I might agree with your argument. But we know from experience that this is simply not the case, and if AMD is suggesting non-800-series chipsets to test the new CPU, you can be pretty sure there's no hidden performance to unlock with the 890 chipset.
 

ethaniel

Distinguished
Jul 10, 2005
And I just thought of leaving AMD this past March to adopt LGA 1156... good thing I didn't. Now Intel has to defend four (yes, LGA 775 is still around) incompatible sockets, wreaking havoc among consumers, while AM3 still has a lot of fuel. Stoooopid Intel...
 
Thanks for doing another article to show me how awesome my i5-750 STILL is compared to AMD's latest offerings. LOL

Seeing the new AMD 1100T compare so closely to the i5-750 and i7-920 in some of those tests is kind of depressing, honestly. AMD needs to challenge Intel more.
 

kathiki

Distinguished
Apr 9, 2010
"Quote" Nvidia is defeated...hardly they completely dominate the workstation/scientific segment where the highest profit is and are right on par basically in the gamer segment. "Quote"

I would elaborate on that phrase a bit more... ATI did not win; it is Nvidia that lost, hands down, with their extremely stupid profiteering policy: "I am the fastest, so I sell the most expensive." In times like this, when the world crisis is on the door of every country, models like the 5850 show what a graphics card should be like: reasonably fast and very cost effective in terms of consumption. As a system builder posted above, MOST people nowadays use PCs for internet access, email, watching movies, and of course DOWNLOADS... Do you really need a card that will keep you warm in the winter, or a card that you can leave idle all night consuming pretty much three times less electricity?

We are just blown away by some benchmarks showing Nvidia surpassing ATI by 2-20 fps, and few of us realise that the monitors we are using CANNOT handle those frame rates...

I have stopped searching through all this benchmark religious crap... I have a 5870, I am really satisfied, and I will just laugh at people that even think of changing their cards for something that brings 20% more fps...

Think about it... 60 fps + 20% = 72 fps... can your monitor handle this? Are you the sort of gamer that can also PERCEIVE the difference?
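
Just to spell that arithmetic out (a trivial Python sketch; the 60 Hz refresh rate is my assumption about a typical monitor):

[code]
rendered_fps = 60 * 1.20            # 60 fps plus a 20% faster card = 72 fps
monitor_refresh_hz = 60             # assumed typical 60 Hz panel

displayed_fps = min(rendered_fps, monitor_refresh_hz)
print(rendered_fps, displayed_fps)  # 72.0 rendered, only 60 actually shown on screen
[/code]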
 

Vermil

Distinguished
Jul 22, 2009
I will not heed the advice to wait. Sure, the landscape may change in a month. So what? No matter; AMD's 6-cores are good enough all-rounders and cheap enough. Many expected CPU breakthroughs rely on software adapting to new extensions. It will take years for such CPUs to fulfill their promise in reality, whilst the effect will likely be displayed immediately in new benchmarks. I'm quite fed up with CPUs that don't live up to their published benchmarks, which I feel was the case with both the P4 and the C2. I'm sure Intel's Nehalem-derived CPUs are much better. I still won't get any, because I've finally caught on to the fact that Intel is a filthy-swine company and I feel tainted by giving them my money. And I'm ecstatic with my current Ph II, so why wouldn't it get another Ph II replacement? Feels like the way to go as a consumer. If you're happy with something...
 

mihaitzateo

Distinguished
Dec 7, 2010
The new Intel Sandy Bridge i5-2400 was benchmarked, and in most tests it performed better than AMD's 1095T X6 CPU (in some tests not by much; in others it performed lower than the 1095T).
I guess that the 1100T is better than the i5-2400, but the price is higher (the 1100T is at $300; the i5-2400 is said to come in at $200 or less).
So I think AMD needs to do something, because otherwise the situation will not be so good for them.
At the moment, in the low-end market, AMD has an advantage with their Phenom X2, which can unlock to a Phenom X3 or X4 (on cheap boards) and, if unlocked, performs better than an i3. No idea how the i3-2100 (the Sandy Bridge equivalent of the i3: 2 cores/4 threads) will perform.
 

jeff77789

Distinguished
Jun 24, 2009
middle paragraph on the last conclusion page:

But Fusion isn't here yet, so let's concentrate on the here and now: the 3.0 GHz Phenom II X6 1075T is $200 at the time of writing, but it performs better than the Core i5-750 in most applications, and is generally on par with the Core i7-920 in our benchmarking suite. Priced $35 higher, the Phenom II X6 1190T Black Edition sports an unlocked CPU multiplier for overclockers and is just about as likely to hit 4 GHz as the Phenom II X6 1100T flagship. Enthusiasts will likely consider the 1090T about as good as the newer chip priced higher. Finally, $265 gets you AMD's fastest hexa-core desktop CPU $600 below Intel’s entry-level six-core model.


It says "Phenom II X6 1190T Black Edition"; that should presumably be the 1090T.
 
[citation]Think about it... 60 fps + 20% = 72 fps... can your monitor handle this? Are you the sort of gamer that can also PERCEIVE the difference?[/citation]
Actually, NO user can perceive the difference because 60fps is all that the human eye can see. To be able to tell the difference, one would have to be Superman! :sol:
 
[citation][nom]Aetherys[/nom]This makes me worry for AMD. Really don't want to see them perish.[/citation]
As long as people like you and me keep buying AMD, they won't perish. There are enough of us. If there weren't, AMD would be dead by now. Also, keep in mind that ATI is making money hand over fist, and that is going a long way toward keeping AMD afloat. I think AMD will be a thorn in Intel's side for a long time to come, just like George Bailey was a thorn in Potter's side! :sol:
 

fwupow

Distinguished
May 30, 2008
Wait? Too late! I just built a new system yesterday using the Phenom II X6 1090T and an Asus M4A..890GX mobo. I doubled my video encoding performance compared to my previous Intel Q6600 Core 2 Quad system. I did a test converting just the video of chapter 1 of the movie "HULK". On my old Q6600 system (no overclock), converting to H.264 using the x264vfw codec took 2 min 44 sec. The same conversion on my new AMD system took 1 min 22 sec. x264vfw is a 32-bit codec, but it slams every core you've got to max utilization. DivX 6.8 also utilizes all your cores, but not to 100%.

So if encoding/transcoding video is what you do, I think building an AMD system with the hexa-core 1090T is a real bargain. The Tom's Hardware test here seems to indicate that if you overclock the 1090T by 100 MHz, you get the same thing as an 1100T.
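
For what it's worth, here's the speedup math on those two times (a quick sketch in Python, using the numbers from my test above):

[code]
# Encode times from my test: 2 min 44 sec on the Q6600, 1 min 22 sec on the 1090T.
q6600_seconds = 2 * 60 + 44       # 164 s
x6_1090t_seconds = 1 * 60 + 22    # 82 s

speedup = q6600_seconds / x6_1090t_seconds
print(f"speedup: {speedup:.2f}x")  # 2.00x, i.e. roughly double the throughput
[/code]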
 

fwupow

Distinguished
May 30, 2008
One other thing: I really like the way the Phenom II X6 CPUs drop down to 800 MHz when idling. I'm surprised that this review didn't include an idle power consumption chart.
 
[citation][nom]fwupow[/nom]One other thing: I really like the way the Phenom II X6 CPUs drop down to 800 MHz when idling. I'm surprised that this review didn't include an idle power consumption chart.[/citation]
From what I understand, the power consumption numbers match those of the Phenom II X4 9xx-series CPUs, so it's possible that they didn't want to repost old numbers. :sol:
 