Radeon HD 6990M And GeForce GTX 580M: A Beautiful Lie

Status
Not open for further replies.

PCgamer81

Distinguished
Oct 14, 2011
People have the right to name their products what they want.

It is NOT a 6990. It is a 6990M. That is what they want to call it, and it is their right.

I would expect anyone with half a brain (especially when shelling out $2500+) would do a little research first.

I had the ASUS G73 with the 5870M, and I knew full well it was just a renamed 5750.

Don't blame AMD that consumers are morons. And don't try to force them to rename their products to compensate for that sad fact. If they wish to name their GPUs in a way that takes advantage of the public's stupidity, then they should be allowed to.

It is the nature of the capitalist beast; a beast that roams the western world at will.

So stick it.
 

scrumworks

Distinguished
May 22, 2009
361
0
18,780
Conveniently, Tom's stopped benchmarking Nvidia's solution after really demanding games like Metro 2033 came in. AMD's solution would have decimated that GTX.
 
Guest
Nice read, but... 1080p?
Even I consider 1920x1200 not that high a resolution at 17 inches (and I'm partly visually impaired).
 


RiiIiight... 'cause that's such an inconvenience when buying a $2k laptop. :heink: All that Googling must be hard. :sarcastic:

It just comes down to AMD/Nvidia bullshit marketing. Although I am an enthusiast, I still find these names to be BS; they require more thought than they should. I shouldn't have to remind myself that a 6990M is a desktop 6870. It should simply be the 6870M, no reminders needed. I would even understand it more if the mobile variant at least had something in common with its desktop variant. For instance, the i7-2630QM and the desktop i7-2600 both feature 4 cores and 8 threads and are built on the same platform, just at much different clock speeds; at least they have something in common...

Sure... and that 2-core, 4-thread i7-2620M, which is numerically bigger than the i7-2600K, is so close in performance :pt1cable:

Let alone the ridiculously slower i7-2610UE, which has half the cores and less than half the clock speed of the desktop version, for about 1/10 of the performance in anything you'd want a Core i7 for... yeah, it's only AMD and Nvidia. I'd say any of theirs is far closer in performance to its desktop counterpart than that i7-2610 is.

Me thinks too many of you folks don't own or purchase enough mobile devices. :pfff:

I don't give a rat's A$$ that my HD 6850M is not as fast as a desktop HD 6850, and my 2630QM isn't as fast as a 2600K. I knew that when I bought it, but I also knew what I was buying, and for the price, with a few RAM/SSD tweaks, I knew it would blow away the majority of desktops out there for editing and gaming despite these shortcomings, even though it might get nowhere near the testosterone overclockers' rigs.

Seriously, this article needs a companion piece on Intel's horribly deceptive CPU naming in order to have some semblance of internal validity and proper balance. :non:
 

azeemtahir

Distinguished
Dec 12, 2011
I don't get it. I mean, we've all known for quite a while that mobile counterparts (of anything, for that matter) will always be slightly (or sometimes DRAMATICALLY) lower in performance than their desktop counterparts. I mean, compare the size of the two, for God's sake! It's like saying, HEY! My headphones aren't half as loud as my Hi-Fi (well, dumb-o, did you note the difference between their 40mm drivers and the 250mm ones in your boombox?). It's like saying that my BIKE runs way, WAY slower than my CAR. Yeah, it's 600cc, but I wanted it to be 2400cc (same as my car) and to hold 4 people, too.

I mean, come on! A laptop is only about twice the weight of a top-of-the-line flagship AMD or Nvidia GPU, with its heatsink and card length! All that for some show-off? Well, n00bs would think so...

I like where this article is going, but most of us already know all that. I mean, the naming scheme exists simply to identify that if the 580 is the best single-GPU Nvidia chip among DESKTOP solutions, then the 580M is the best single-GPU among MOBILE solutions. Simple. Like we say that Beats by Dr. Dre are the Ferrari of headphones. So, if Ferrari were into making headphones, Beats Studio would be what they'd make. You can surely call them the Enzo of headphones, but you sure as hell can't ride in them, let alone at 200 MPH.

Bottom-line: Where's the Confusion?
 
Guest
I get the soapbox about this issue, but really, so what? They're both doing it. It's not going to change.

The real question I needed answering was which laptop graphics solution is better for gaming?

It seems the 580M has a very slight, almost nonexistent advantage over the 6990M. Is that right? In some games the 580M beats the 6990M handily, games like SWTOR. On the other hand, the 6990M seems to beat the 580M in Skyrim. This is the information I need. I noticed that both cards scored 35-38 fps in Skyrim with everything maxed, with the 6990M slightly ahead. SWTOR saw scores in the 45-50 fps range (45 for the 6990M and 50 for the 580M) at max settings. So which card is better? That's what I'd like to know, from both a straight-up performance perspective and a cost perspective.
 
Guest
Thank you so much for this article. I was just about to blow $2200 on an Alienware M17x outfitted with the Radeon 6990M. I'm absolutely horrified by this; a desktop with the Radeon 6990 is going to be $1500 (about 30% less) and perform up to twice as well? That's pathetic. Wow. Thank you for saving me $2200.
 

mcgrath678

Honorable
Feb 3, 2013
I can understand why people here are mad at AMD/ATI and Nvidia for giving mobile products names similar to their desktop counterparts without the performance to back those names up. I have noticed a trend where younger and younger kids will find the truth on the internet, and this is not the first time I have seen misleading information given about computer components. Look at the speed ratings of SSDs: they never come close to the speeds written on the box, and I never get 1 TB of usable space on a 1 TB drive. Projector and TV companies lie about lumens, and Ford lies about gas mileage.

I am here to say that I am so happy the day has finally come: if I am willing to spend the money, I can buy a laptop that can play The Witcher 2 and Metro 2033 on Ultra settings at 1080p with playable frame rates. The 6990M was one of the first cards capable of rendering Crysis in all of its glory at 1080p with nice frame rates. I had a PS3, but its graphics capabilities have been getting worse every year, and since I travel all of the time, a desktop was not an option.

When I started researching gaming laptops, I had no idea that many companies don't even make their laptops; they just slap a name like Sony on someone else's creation and sell it. After much research, I ended up buying a Clevo/Sager 150HM series with a single 6990M. This laptop really vents some heat while gaming. I have overclocked the card 16% and run everything from FurMark, SETI, and Heaven DX11 to Bitcoin mining at 150 Mhash/s. Sager allows user upgrades of the video card without voiding the warranty, so I will be looking for an 8970M when it comes out.

I also looked at the specs for the upcoming PlayStation 4. It is rated at 1.85 TFLOPS, the same performance I am getting out of my year-and-a-half-old overclocked 15.6" Sager laptop with an ATI 6990M, and mine has a built-in 1080p screen, an SSD, and 8 GB more RAM than the PS4.
 