Radeon HD 6990M And GeForce GTX 580M: A Beautiful Lie

Page 2
Status
Not open for further replies.
[citation][nom]Yargnit[/nom]For the person who asked about the 560/560m, the 560m is basically an underclocked 550 Ti. I wouldn't even care about the $ from a lawsuit as long as it forced AMD/Nvidia to admit in their naming schemes, from now on, how crappy their mobile GPUs are. Maybe they'd even work harder on overall power efficiency if they were suddenly unable to put out a mobile GPU model number higher than 6870.[/citation]

The 28nm GPUs should have a large performance increase over the previous generation. Nvidia seems insistent, in what little info they give us, that Kepler has far surpassed even AMD's new GCN arch, and GCN is pretty good.

Regardless of the improvements, AMD and Nvidia really should be in trouble for this crap... I mean, sure, they never fooled me or anyone I know, but that doesn't make it any less wrong. Makes me want to mess with a few salespeople, since there's little else I can do and it'll be fun...
 

need4speeds

Distinguished
I commented about a gaming laptop purchase. I could not recommend the expensive $1,800 gaming laptop over the $560 Llano laptop. Why, you ask?
The gaming laptop is a massive machine with an 18" screen, weighing in at a heavy 8 lbs with a 1.6-hour battery life. It has a 2.4 GHz Sandy Bridge Core i7 and a choice of either SLI 580Ms or CrossFire 6990Ms. Its 1080p screen works the CPU and GPU harder, yet at laptop screen sizes 1080p is finer than you need. I would call this a "luggable" rather than a laptop, and with nearly zero battery life, a Shuttle-style system with a 17" LCD would be just as portable. It defeats the whole idea of having a laptop at all. They are supposed to be portable, right?

The Llano, an A8-3870M with 480 shaders at 2.0 GHz, is slower, true. But its 17.3" 1366x768 screen has pixels small enough at that size to still look sharp, and it gives the APU a break.
The battery rating is 9 hours for normal tasks and 5.5 hours gaming. It is also lighter and thinner.

At 1366x768, the Llano can deliver about the same fps as the $1,800 laptop, because the latter has to work harder to deliver 1080p gaming.
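To put the "works harder" point in numbers, here is a quick back-of-the-envelope pixel count (my own rough sketch; pixel count only approximates GPU load, since shading cost per pixel varies by game and settings):

```python
# Back-of-the-envelope pixel-count comparison between the two panels
# discussed above. GPU fill/shading load scales roughly with pixel count.
res_1080p = 1920 * 1080   # the $1,800 gaming laptop's panel
res_768p = 1366 * 768     # the Llano laptop's panel

print(res_1080p)                        # 2073600 pixels
print(res_768p)                         # 1049088 pixels
print(round(res_1080p / res_768p, 2))   # 1.98 -- nearly twice the pixels per frame
```

So the 1080p machine renders almost 2x the pixels every frame, which is why the slower Llano can close the fps gap at 1366x768.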

AMD would be smart to ditch the falsely named 6990M altogether and just make a new, faster-clocked Llano APU for gaming laptops, maybe with more shaders, something like 640 or 800.

 

AppleBlowsDonkeyBalls

Distinguished
Don't get why people are complaining about the naming scheme. Laptop and desktop series for both AMD and NVIDIA are SEPARATE. Radeon HD 6990M is meant to represent that it's the highest-tier, fastest GPU from AMD. Same thing for the GTX 580M.

Why anyone would think the desktop and laptop parts are gonna be comparable in performance is beyond me. They've never been, and they never will be. Similarly, the naming schemes are different, because again, they're separate series.

Now stop whining about it. If you're looking at buying laptops with GPUs this powerful and expensive you'd better know what they offer, anyway.
 
[citation][nom]AppleBlowsDonkeyBalls[/nom]Don't get why people are complaining about the naming scheme. Laptop and desktop series for both AMD and NVIDIA are SEPARATE. Radeon HD 6990M is meant to represent that it's the highest-tier, fastest GPU from AMD. Same thing for the GTX 580M.Why anyone would think the desktop and laptop parts are gonna be comparable in performance is beyond me. They've never been, and they never will be. Similarly, the naming schemes are different, because again, they're separate series.Now stop whining about it. If you're looking at buying laptops with GPUs this powerful and expensive you'd better know what they offer, anyway.[/citation]

A lot of gamers are really computer-illiterate and would assume that the mobile GPUs are similar, or at least not too far off, from the desktop versions with similar model names. These complaints are perfectly valid because we are taking the standpoint of less savvy people and trying to anticipate what they would think, and I'm convinced we did pretty well at that.

Really, would you assume that the 6990M has less than half the performance of the 6990 if you were less computer-literate? Would you assume that the 580M has similar performance to the 6990M, when it is actually faster even though the exact opposite is true for the desktop GPUs?
 

husker

Distinguished
If I create a line of beer mugs called "Schmuggles" and my largest one is the "Schmuggle 7000", then I have every right to name a much smaller one the "Schmuggle 7000M". It's my naming convention, and it does not have to adhere to anyone else's opinion on what a logical naming convention should be.
 
[citation][nom]husker[/nom]If I create a line of beer mugs called "Schmuggles" and my largest one is the "Schmuggle 7000", then I have every right to name a much smaller one the "Schmuggle 7000M". It's my naming convention, and it does not have to adhere to anyone else's opinion on what a logical naming convention should be.[/citation]

We all have the right to be stupid and misleading according to the law, but that doesn't make it any less morally and ethically wrong. A few hundred years ago it was every white American's right to enslave colored people, but that didn't make it any less morally and ethically wrong from a non-biased (or at least less biased) view. With computers, we enthusiasts are the less biased view, and we should at least recognize that something should be changed, as it has been in the past.

Not that I'm saying misleading naming conventions and slavery are equal evils, but they are both wrong anyway.
 
[citation][nom]blazorthon[/nom]We all have the right to be stupid and misleading according to law but that doesn't mean it's any less morally and ethically wrong. A few hundred years ago it was every white American's right to enslave colored people but that didn't make it any less morally and ethically wrong from a non-biased (or at least less biased) view. With computers we enthusiasts are the less biased view and should at least recognize that something should be changed like it has been in the past.Not that I'm saying misleading naming conventions and slavery are equal evils but they are both wrong anyway.[/citation]

Fixed "an non-biased" to "a non-biased" in the second sentence.
 

lp231

Splendid
I've never associated intense gaming with notebooks, due to their physical space limits, subpar specs, and massive heat output. And with both vendors going the rebadging route, I personally gave up eons ago, as this practice will continue for a long time.
I still prefer notebooks with discrete graphics so I can get decent graphics performance and a bit of simple, light gaming (or Flash games), but I just don't see Crysis 2 or Skyrim running on a notebook.
If you really want a gaming computer, get a desktop.
 
Guest

Guest
I have to disagree with the conclusion: "...anyone who expected one Radeon HD 6990M or GeForce GTX 580M to facilitate adequate performance is going to be sorely disappointed."

You complain about AMD and NVIDIA using marketing-driven naming, and yet you're doing the same thing with your benchmarks. I'd hardly call high detail 1080p gaming "adequate" -- that word is reserved for medium quality and all the 1366x768 displays. More importantly, basically maxing out detail and enabling 4xAA is not something most users need, especially laptop users.

You're partially right that the naming is silly, but obviously we're not going to get a real HD 6970 or GTX 580 chip (even at lower clock speeds) into a laptop chassis, let alone two of them. Just one of those cards would use the full power output of the Clevo X7200 brick (300W). Anyway, we'll see what we get with 7970M (probably a lower clocked 7700M) and GTX 680M (probably a lower clocked GTX 660 Ti), but really it's just a matter of knowing what you're getting and setting your expectations appropriately. It's the good and bad of model numbers, but it's really no worse than what we see on the mobile CPU side of the fence.

PS -- Your "new" gaming benchmarks are hardly recent. What about Skyrim, Battlefield 3, Batman: Arkham City, The Witcher 2, Deus Ex: Human Revolution, Dragon Age 2, Assassin's Creed: Revelations, or Call of Duty: Modern Warfare 3? The most recent title in your list appears to be DiRT 3, a game that was released in May 2011 (eight months ago). Time to update your definition of "recent", I'd say.
 

g-unit1111

Titan
Moderator
Interesting stuff for sure, but I want to know how it is possible to test laptop-specific hardware in a desktop configuration and expect to get the same results. If I'm buying a laptop with a high-end GPU, I want to know things like battery life and lag times in the actual notebook configuration.
 

kinggremlin

Distinguished
[citation][nom]aznshinobi[/nom]Agreed, there is an M for a reason. It's the buyers fault for not researching. Most buyers just buy the most expensive product and assume it's good. This will teach them otherwise.[/citation]


No, it won't. Anyone reading this site should already have been aware of this. Anyone who has to read this article to see the truth isn't going to be reading this site in the first place. All this article is doing is preaching to the choir.
 
... So, “common knowledge” works against the partially-informed buyer....

The entire world discriminates against the partially informed, come to think of it. Schools and universities insist on giving these exams which hurt the feelings of the uninformed and the partially informed. Job interviewers do the same, with their questions and tests. You're not even allowed to drive if you're only partially informed about traffic rules... It's time we took a stand and said ENOUGH!!! We the partially informed demand equal rights!

 

Tavo_Nova

Distinguished
Just because it's expensive doesn't mean it's better than a slightly cheaper one, assuming of course they are both on the same level. After all, a $300 build would never beat a $2,000 build at 1920x1080 with all high settings. That's just an example, but still, this is a good read.
 
The naming scheme doesn't bother me. If you buy a mobile part named after the desktop's flagship model, you know you are getting the most powerful mobile product that company offers. That seems to make it easier. It seems it's only those in the know, who examine it too closely and read too much into it, who have an issue with it.
 

jacobdrj

Distinguished
This article reminds me of when Intel used the name Pentium M; it actually meant that the processor was both faster and more power-efficient... The Pentium M chips were SO much better than their desktop counterparts (NetBurst P4s) that people actually started to build desktops with the P-Ms (or AMD chips) rather than use the flamethrower that was NetBurst... I believe the Pentium M was the underlying architecture for Core and Core 2...
 
[citation][nom]jacobdrj[/nom]This article reminds me of when Intel used the name Pentium M; it actually meant that the processor was both faster and more power-efficient... The Pentium M chips were SO much better than their desktop counterparts (NetBurst P4s) that people actually started to build desktops with the P-Ms (or AMD chips) rather than use the flamethrower that was NetBurst... I believe the Pentium M was the underlying architecture for Core and Core 2...[/citation]
It was :)
 

JonnyDough

Distinguished
IMO, if you don't do your homework before investing into something then you deserve to lose money. You wouldn't be buying up Kodak stock without realizing that they are filing for bankruptcy - unless you're a fool with more money than brains.
 

gm0n3y

Distinguished
Wow, I knew that an 'M' processor was worse than its desktop counterpart, but I thought they were basically just underclocked versions. I haven't bought a laptop in over 10 years, but I may be getting one soon and will certainly keep this in mind.
 

Crashman

Polypheme
Former Staff
[citation][nom]Kaldor[/nom]Nvidia may not be as guilty at this point in time, but their renaming and spinning out the core from the 8800, 9800 and 250 cards was epic. Nothing like selling the same old GPU (with minor improvements) for 2+ years.[/citation]Perhaps you're looking for something like this?
http://www.tomshardware.com/reviews/geforce-gtx-280m,2353.html[citation][nom]gm0n3y[/nom]Wow, I knew that an 'M' processor was worse than its desktop counterpart, but I thought they were basically just underclocked versions. I haven't bought a laptop in over 10 years, but I may be getting one soon and will certainly keep this in mind.[/citation]Thanks for proving the point :)
 

noob2222

Distinguished
"We think it's both sad and wrong that two Radeon HD 6990Ms can't beat one Radeon HD 6990 when complemented by the same CPU, memory speed, and chipset. Pathetic."

There is absolutely no reason why they should. You have two 100 W parts vs. a 375 W part. What's the max power draw in a mobile platform? The specs between the two aren't even close.
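The power-budget argument above can be sketched numerically; the wattages below are the ones quoted in the post, not official specifications:

```python
# Rough power-budget comparison using the wattages quoted in the post above.
mobile_tdp_w = 100    # per 6990M, as stated in the post
desktop_tdp_w = 375   # desktop HD 6990, as stated in the post

crossfire_mobile_w = 2 * mobile_tdp_w                 # two 6990Ms in CrossFire
print(crossfire_mobile_w)                             # 200 W total
print(round(crossfire_mobile_w / desktop_tdp_w, 2))   # 0.53 -- about half the desktop budget
```

Two 6990Ms together have barely half the power budget of a single desktop 6990, so expecting them to match it was never realistic.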

This entire article stinks of AMD bashing while almost praising Nvidia, especially after looking over the original 6990M vs. 580M article:
http://www.tomshardware.com/reviews/gtx-580m-sli-hd-6990m-crossfire,3022-12.html

The Nvidia system pulls ~30 watts more than the 6990M system, even though both GPUs are rated at 100 W.

And exactly what market do these laptops even cater to? The laptop itself was $7,000 with two 6990Ms or $7,600 with two 580Ms.

Considering that's most likely going to be less than 1% of all laptops sold, I guarantee whoever buys something that extravagant will be doing their homework first, or just has more money than brains.
 

Crashman

Polypheme
Former Staff
[citation][nom]noob2222[/nom]There is absolutely no reason why they should. You have two 100 W parts vs. a 375 W part. What's the max power draw in a mobile platform? The specs themselves aren't even close. This entire article stinks of AMD bashing while almost praising Nvidia, especially after looking over the original 6990M vs. 580M article: http://www.tomshardware.com/review [...] 22-12.html The Nvidia system pulls ~30 watts more than the 6990M system, even though both GPUs are rated at 100 W. And exactly what market do these laptops even cater to? The laptop itself was $7,000 with two 6990Ms or $7,600 with two 580Ms. Considering that's most likely going to be less than 1% of all laptops sold, I guarantee whoever buys something that extravagant will be doing their homework first.[/citation]Hmm, a little miffed at the focus on AMD's part? Nvidia's been in the spotlight before; it's the guiltiest party that gets the attention. If you find that hard to believe, perhaps this will help:
http://www.tomshardware.com/reviews/geforce-gtx-280m,2353.html
 
[citation][nom]Burninator[/nom]I have to disagree with the conclusion: "...anyone who expected one Radeon HD 6990M or GeForce GTX 580M to facilitate adequate performance is going to be sorely disappointed."You complain about AMD and NVIDIA using marketing-driven naming, and yet you're doing the same thing with your benchmarks. I'd hardly call high detail 1080p gaming "adequate" -- that word is reserved for medium quality and all the 1366x768 displays. More importantly, basically maxing out detail and enabling 4xAA is not something most users need, especially laptop users.You're partially right that the naming is silly, but obviously we're not going to get a real HD 6970 or GTX 580 chip (even at lower clock speeds) into a laptop chassis, let alone two of them. Just one of those cards would use the full power output of the Clevo X7200 brick (300W). Anyway, we'll see what we get with 7970M (probably a lower clocked 7700M) and GTX 680M (probably a lower clocked GTX 660 Ti), but really it's just a matter of knowing what you're getting and setting your expectations appropriately. It's the good and bad of model numbers, but it's really no worse than what we see on the mobile CPU side of the fence.PS -- Your "new" gaming benchmarks are hardly recent. What about Skyrim, Battlefield 3, Batman: Arkham City, The Witcher 2, Deus Ex: Human Revolution, Dragon Age 2, Assassin's Creed: Revelations, or Call of Duty: Modern Warfare 3? The most recent title in your list appears to be DiRT 3, a game that was released in May 2011 (eight months ago). Time to update your definition of "recent", I'd say.[/citation]

I have to agree that 1080p is more than just an adequate resolution, especially when maxed out, but it is becoming very common. I also say the naming conventions are more than silly; they're outright misleading. And the 6970 doesn't use 300 watts; the reference 6970 has a TDP of 251 watts. Underclocked, it would obviously use significantly less power. Underclock it to the performance of the current 6990M and it should use less power to deliver that performance, but it would be more expensive. I find it unlikely that the next top mobile Radeon would use a 7700-series chip, and I'm at a loss as to what you based that assumption on. That would mean it would have about the same performance as the 6990M, or worse, and that's before underclocking... The 7770 is almost as fast as the 6850, and the 7790 is presumably about as fast as the 6870, maybe a little slower. If the next mobile Radeon were based on the 7790, it would be roughly equal to the 6990M in performance while using less power.

Based on the apparent pattern of 7000-series cards having roughly equivalent TDPs to their 6000-series counterparts, I'll assume the next top mobile Radeon will be based on the 7870, just as the 6990M is based on the 6870. I'll also go as far as to say it will be about as fast as either the 6950 or 6970, judging by the performance delta the currently benchmarked 7000 cards show over the 6000s. Moving on, the top mobile Radeon right now is the 6990M, so who's to say the next one will be the 7970M? That would be an improvement, but you seem to be making up names and failing to look objectively at the GPUs you make assumptions about.

I'll agree with your take on Tom's meaning of "recent"... eight months is too long when so many newer games are out.
 