Maxwell Goes Mobile: First GeForce GTX 970M Benchmarks

Status
Not open for further replies.

MrMusAddict

Honorable
Jun 13, 2013
13
0
10,510
Wait what? I thought the whole reason that nVidia skipped the 800 desktop series was because they wanted to accurately represent mobile chips as less powerful; mobiles have 800, desktops have 900.

Or did they simply skip the 800 desktop cards so that, from now on, desktop chips in a series will release before their mobile counterparts?
 

MasterMace

Distinguished
Oct 12, 2010
1,151
0
19,460
Wait what? I thought the whole reason that nVidia skipped the 800 desktop series was because they wanted to accurately represent mobile chips as less powerful; mobiles have 800, desktops have 900.

Or did they simply skip the 800 desktop cards so that, from now on, desktop chips in a series will release before their mobile counterparts?

They did a similar thing with the GTX 300 series. This isn't a surprise.
 

ferooxidan

Honorable
Apr 15, 2013
427
0
10,860
"Damn, damn, damn, damn you Nvidia!" say those who bought laptops with an 870M or 860M because the price was so appealing.
Anyway, based on the 970M spec sheet, I'm guessing those specs would be appropriate for a desktop GTX 960, if they're making one.
 

flowingbass

Distinguished
Oct 28, 2010
152
1
18,695
Boy am I glad I still hadn't succumbed to the urge to buy one of those Asus G750Js with an 870M in it.

For the past year I keep finding myself gaming more and more on my old gaming laptop. Sitting on a very comfortable couch with my feet up while using a controller is just so much better than being stashed in my bedroom whole weekends playing on my desktop. My girlfriend can watch TV while I'm busy gaming beside her on the couch, carbo-loading on food, haha!

I would really find it hard to resist if they released a laptop GPU with GTX 780 Ti-class performance. I just might wholeheartedly buy that right away.

I wish AMD had something competitive on the mobile side, just to keep prices down. An R9 M290X is just not in the same league as even an 880M, let alone the 980M.
 

Shneiky

Distinguished
nVidia is going the Intel way - sacrificing desktop performance for mobile efficiency. Guess I'm just getting old; how fast my workstation renders is no longer a prime goal for manufacturers, as it was in 2005-2010. Mobile device sales and the lack of OpenCL/OpenGL support are what kill people like me, every day, slowly...
 

epicfail331

Reputable
Oct 7, 2014
1
0
4,510
Origin PC also does SLI laptops. Will there be a 980M benchmark as well? I must say, I've been trying to save for a DominatorPro with an 880M in it, but I might be better off waiting for these to come out.
 

amk-aka-Phantom

Distinguished
Mar 10, 2011
3,004
0
20,860
That is a heck of a lot of power for a laptop to dissipate. I also don't know too many 9- to 15-cell Li-ion batteries capable of dishing out that many watt-hours with the CPU running full tilt on top (Haswell or not).

Nearly every modern gaming laptop heavily throttles the GPU when running on battery exactly for this reason.

nVidia is going the Intel way - sacrificing desktop performance for mobile efficiency. Guess I'm just getting old; how fast my workstation renders is no longer a prime goal for manufacturers, as it was in 2005-2010. Mobile device sales and the lack of OpenCL/OpenGL support are what kill people like me, every day, slowly...

Ummm, what? I guess you haven't heard of the desktop 970/980 release and how they offer the same performance for way less money and power consumption, letting you get way more for the same money/energy/heat? In the end you DO get more performance. Also, you don't use GeForce in a workstation; that's what Quadro is for, and you get better OpenGL support with it. What's killing you is looking at products not intended for you and thinking that they are.
 

jasonelmore

Distinguished
Aug 10, 2008
626
7
18,995
These GPUs are not true mobile Maxwell parts; they're just highly binned desktop GPUs converted for mobile. So right now, Nvidia has taped out only two Maxwell dies, period, across desktop and mobile: the GTX 980 die and the 750 Ti die. Every Maxwell part, mobile or desktop, uses one of those dies.
 

Shneiky

Distinguished
amk-aka-Phantom,

You are just plain wrong and are making stupid, blind assumptions.

I am currently at work, on a break, writing this from an HP Z420 with a Xeon E5-2650 v2 and a Quadro K4000. We use the complete Adobe CS6 suite and the complete Autodesk Maya and 3ds Max 2015 suites. This Quadro K4000 and its drivers are horrible. I have a scene in Maya set in meters, and any time I zoom out 15 units away from the 3D model, the viewport displays corrupted geometry: big patches of pitch black. The issue is so severe that it impacts my work. I am using the much-recommended Viewport 2.0. None of the options I tried fixed the issue. Switching from DX to OpenGL does not help at all; texture corruption and more follows. I have tried every driver version of every type I could find (switched around 10 of them, clean installations of Windows, etc.). I wrote to both Autodesk and nVidia regarding this issue - nothing. All four Z420 machines with Quadro K4000s have this problem, in both Max and Maya.

This problem is so dramatic that I take my work home from the office, and when I get off I load the scenes on my own GTX 650 Ti, because the dumb GTX 650 Ti just works. If you really think Quadros are the bread and butter and the savior, you are deeply wrong. Quadro drivers are worse than GTX drivers, and the so-called support is laughable. I have used many Quadros, GTs/GTXs, and Radeons over the years, and my-oh-my, the GTX cards have been mostly rock solid (with few exceptions), while the Quadros got worse and worse. Sure, the Quadro K4000 can push 4-5 times the millions of polys my personal GTX 650 Ti can, but with all the corruption, that does not matter. Everyone in the office on the Z420s with the Quadros envies the guy next to us with a non-branded machine and a GTX 770. Things on the GTX just work.

Please get some hands-on experience with these before you start commenting.

As for the other point: the GTX 970 and 980 are supposed to be high-end cards for enthusiasts. With 50 W more of juice, those would have been some impressive results. For many generations, the top of the line always pushed the envelope to the limit. Because AMD does not offer any competition, nVidia is not pushing the performance front as it used to. Same story on the CPU side: when your competitor can't match your performance, you start pushing efficiency. nVidia and Intel are both going for small increments in performance because AMD is riding in the back seat, and while OpenCL and HSA are still a vision of the future, things are not going to change soon.
 

amk-aka-Phantom

Distinguished
Mar 10, 2011
3,004
0
20,860
Shneiky,

Whatever your experience may be with Quadro/FirePro, the point is that nVIDIA/AMD do not expect you to use GeForce/Radeon for professional work. It's true that I've never used workstation cards myself (although I have friends who do, and luckily they don't run into such issues), but I'm speaking from nVIDIA's point of view: they are aiming GeForce at gamers, not workstations, and they aren't "sacrificing" anything. You are complaining about desktop GPUs allegedly not meeting your requirements in a news article about laptop gaming graphics. I repeat myself: umm, what?

Your 770 isn't top-of-the-line anymore, and there are plenty of newer, more powerful cards for you to buy if you want more performance. nVIDIA is right to focus on power efficiency, especially in laptop graphics - it enables smaller form factors and better battery life. If you are not happy with how they handled the GTX 970/980 for desktop, get OC'd versions, OC them yourself, or wait for the Maxwell-based 980 Ti/Titan X/whatever they'll call it.

The power envelope isn't set in stone; it's simply what the card requires at nVIDIA reference clocks, and the numbers are very impressive considering the performance at this wattage, especially compared to the previous generation. Maxwell OCs well - most factory-OC'd 970s I've seen tested match the stock 980, and I bet the 980 can do even more. So while I may not be aware of your quite non-standard situation (again, none of my friends who use Quadros for work have such issues), your initial complaint about nVIDIA allegedly sacrificing something is what's wrong here.
 

Shneiky

Distinguished
You still miss the point. nVidia has adopted Intel's plan. HEDT and workstations do not need the efficiency a laptop does, but mobile takes front place. So you design a chip for mobile and try to squeeze some extra performance out of it on the desktop so that such a product actually makes sense. What I am doing here is ranting that the days of massive performance boosts are gone in favor of other parameters. And yes, they are sacrificing a lot of stuff that you don't give a damn about, because you only look at gaming benchmarks.

Anyway, I never said the GTX 770 is top of the line, nor that I have one. We have one at the office, and it mops the floor with the K4000s.

The problem is: we need CUDA for software reasons, the new Kepler Quadros' bad drivers and lack of proper support make them unusable, and the new GTXs have 1:32 double precision. It's like being stuck between a rock and a hard place. nVidia's own Fermi-based Quadros and GTXs are mopping the floor with their Kepler fiasco, although the GTX Keplers are doing remarkably well. nVidia does not expect me to use a GTX for professional work, but I have no other option when we are stuck with CUDA and their Quadros do a worse job for eight times the price (GTX 650 Ti - 100 euros, Quadro K4000 - 800 euros).

What is truly wrong here is that nVidia started their CUDA fiasco, and CUDA is inferior in every way to OpenCL, but just because it was first, it still exists (and because nVidia pays certain developers to use CUDA only). And when someone complains about their fiasco, fanboys like you jump all over them. If you don't want me to rant about their GTXes, well, make them fix their Quadros. Because especially in the 400, 500, and 600 series, the GTX was the workstation card of choice - not because it was cheaper, but because it worked.
 


Clevo has been custom-building laptops to your specifications for at least a decade: pick a base model, then options from a list of available componentry. Most of the boutique brands (Falcon Northwest, WidowPC, VoodooPC, and even pre-Dell-purchase Alienware) are/were Clevo units. They can be purchased through distributors throughout the world.

http://www.lpc-digital.com/sager-np9377-special.html

As for the desktop-versus-laptop performance issue, you can control how laptops perform, both on battery and when plugged in; it's simply a matter of adjusting the power profile.

We use GeForce cards, which satisfy all our CAD-based needs quite well. AutoCAD 2D and 3D actually perform better on GeForce than they do on Quadro. Even Maya does surprisingly well, though if you're using SolidWorks, Quadro is the only way to go.

http://www.tomshardware.com/reviews/specviewperf-12-workstation-graphics-benchmark,3778-9.html
 

fwupow

Distinguished
May 30, 2008
90
0
18,630
Mobile GPUs in SLI!? That made me do a double take. Far out! It's just freaky at first, because you're taking a powerful GPU, watering it down to half its capability, and then adding a second one to get back up to about the original performance level. This only makes sense if you can't dissipate enough heat from a single GPU chip running at full strength.

This is really a problem nowadays. We've got GPUs consuming more power and generating more heat than CPUs, but there is no room to install the kind of dual-fan, toaster-sized heat sinks that have become common on top of CPUs. Manufacturers are trying to stretch the heat sinks out along the length of the card (and then some) and use multiple high-speed fans, but let's face it: these cards run H-O-T, and air cooling just isn't getting the job done. I think multiple GPUs on a single card may be the way to go.
 
SLI laptops have been rather popular at the high end for quite some time. If memory serves, Clevo released the first one back around 2005-2006; it had a 19-inch screen, as I recall.

Two GPUs on a single card usually cost more and don't clock as well because of the heat issue.

EDIT: Found it - the Clevo M590K, sold by Alienware and several other brands.

http://www.notebookreview.com/news/clevo-guide/
 