Nvidia GeForce GTX 960M, 950M Mobile GPUs Debut


Vlad Razvan

Honorable
Mar 14, 2013
23
0
10,510
" it seems the age of the heavy, chunky gaming laptop is ending" - you're so wrong. First of all, the 950m and 960m GPUs will not satisfy a true gamer (unless all they play is MMORPGs). Specs-wise, the 960m's 650 cuda cores are a little over half of the 1024 cores the GTX 960 (desktop version) uses - this should equal a little over 50% of the 960's perfromance. I for one am not happy with the performance of the desktop GTX 960 in recent games @ full HD (laggy, crawls to a halt if you enable quality textures and shadows due to the slim 128 bit bus) - the mobile version will be even slower.

As a gamer, I want to enjoy content at as high a quality as I can, so I go for the fastest card I can afford. These cards will only fit into "bulky gaming laptops" due to power and cooling requirements. Furthermore, I'd rather own a large, bulky laptop that can house a powerful gaming configuration than a slow, thin "gaming" laptop. As it is, my laptop sits on my desk 90% of the time - I don't even use the built-in keyboard and monitor (and why would I, when all manufacturers refuse to make anything bigger than 17.3").
 

Quixit

Reputable
Dec 22, 2014
1,359
0
5,960
Well, compared to the new MacBook these are positively enormous. Perhaps our definition of how large a laptop should be is changing, not the relative size of gaming laptops.
 

jase240

Honorable
Aug 4, 2012
116
0
10,690
" it seems the age of the heavy, chunky gaming laptop is ending" - you're so wrong. First of all, the 950m and 960m GPUs will not satisfy a true gamer (unless all they play is MMORPGs). Specs-wise, the 960m's 650 cuda cores are a little over half of the 1024 cores the GTX 960 (desktop version) uses - this should equal a little over 50% of the 960's perfromance. I for one am not happy with the performance of the desktop GTX 960 in recent games @ full HD (laggy, crawls to a halt if you enable quality textures and shadows due to the slim 128 bit bus) - the mobile version will be even slower.

As a gamer, I want to enjoy content at as high a quality as I can, so I go for the fastest card I can afford. These cards will only fit into "bulky gaming laptops" due to power and cooling requirements. Furthermore, I'd rather own a large, bulky laptop that can house a powerful gaming configuration than a slow, thin "gaming" laptop. As it is, my laptop sits on my desk 90% of the time - I don't even use the built-in keyboard and monitor (and why would I, when all manufacturers refuse to make anything bigger than 17.3").

Most "true" gamers don't run their games at full settings, many lower the quality to guarantee absolute performance and also reduce blur(motion blur and AA can make it harder to play).

A desktop GTX 960 can run every current game at 1080p maxed out, with a few exceptions that force you to disable a few non-essential settings. This GTX 960M should satisfy nearly every gamer who is willing to game on a thin laptop.

I too believe the days of bulky laptops are coming to an end; it's about time.
 

Kraszmyl

Distinguished
Apr 7, 2011
196
0
18,760
Surprised no one has commented on the fact that the 950M and 960M are just rebadges of the 850M and 860M with a slight clock boost and no Kepler variant to muddy things. Honestly, can't blame them this time though, considering Maxwell is still for the most part Maxwell, and it makes the lineup easier to read.
 

fatboytyler

Distinguished
Jan 29, 2012
590
0
19,160
"...it seems the age of the heavy, chunky gaming laptop is ending."

This is utterly false. This is the exact reason I went with the MSI GT70 over the GS70 Stealth. You can't use your hardware to its fullest extent if it hits throttling temps within a few minutes of rendering or gaming. Thin gaming laptops have obvious temperature and throttling issues, not to mention the ultrabook-style chassis can make maintenance quite difficult. My GT70 is like an apartment in there and couldn't be easier to service. Sure, it's almost 10 pounds and over 2" thick, but that style will never die with the enthusiast gamer looking for a laptop. Can't have it catching fire on you in the middle of the library...
 
" it seems the age of the heavy, chunky gaming laptop is ending" - you're so wrong. First of all, the 950m and 960m GPUs will not satisfy a true gamer (unless all they play is MMORPGs). Specs-wise, the 960m's 650 cuda cores are a little over half of the 1024 cores the GTX 960 (desktop version) uses - this should equal a little over 50% of the 960's perfromance. I for one am not happy with the performance of the desktop GTX 960 in recent games @ full HD (laggy, crawls to a halt if you enable quality textures and shadows due to the slim 128 bit bus) - the mobile version will be even slower.

As a gamer, I want to enjoy content at as high a quality as I can, so I go for the fastest card I can afford. These cards will only fit into "bulky gaming laptops" due to power and cooling requirements. Furthermore, I'd rather own a large, bulky laptop that can house a powerful gaming configuration than a slow, thin "gaming" laptop. As it is, my laptop sits on my desk 90% of the time - I don't even use the built-in keyboard and monitor (and why would I, when all manufacturers refuse to make anything bigger than 17.3").

Most "true" gamers don't run their games at full settings, many lower the quality to guarantee absolute performance and also reduce blur(motion blur and AA can make it harder to play).

A desktop GTX 960 can run every current game at 1080p maxed out, with a few exceptions that force you to disable a few non-essential settings. This GTX 960M should satisfy nearly every gamer who is willing to game on a thin laptop.

I too believe the days of bulky laptops are coming to an end; it's about time.

The GTX 960 is very much on the low end of the cards, something only a person on a tight budget should even consider, and at that point, in the US at least, the R9 285 or 280 make more sense. The 960 is barely better than mobile GPUs in some games - in Arma 3, for example, there's only an 8 fps difference at 1080p between it and a GTX 880M on ultra settings, with both below 60 fps. For gamers who want the best visual experience on the go with playable frame rates, I don't see bulky laptops going away anytime soon. Those who play casual or hardware-light games will certainly benefit from going with smaller laptops, but that isn't going to cut it for somebody like me. When I'm in some third-world country for months, I don't want a crappy eco-friendly GPU; I want the most powerful hardware I can afford in that thing. As long as it fits in my ruck, I don't care how 'bulky' it is.
 
Surprised no one has commented on the fact that the 950M and 960M are just rebadges of the 850M and 860M with a slight clock boost and no Kepler variant to muddy things. Honestly, can't blame them this time though, considering Maxwell is still for the most part Maxwell, and it makes the lineup easier to read.

+1!

This really needs to be mentioned in a launch article like this. The way the article reads, touting Battery Boost and ShadowPlay, it sounds like we're dealing with new features for the 9xxM series, when in reality this is a straight-up rebadge. In fact, the 950M has a lower base clock speed than the 850M, despite both being GM107 parts. Plus, since the 950M ships with either GDDR5 or DDR3, it could end up a little slower (GDDR5) or significantly slower (DDR3) than the previous-"generation" 850M.
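To put rough numbers on the GDDR5-versus-DDR3 point, here's a minimal peak-bandwidth sketch (bus width x effective transfer rate / 8); the transfer rates below are illustrative assumptions for a mid-range mobile part, not official 950M or 850M specifications:

```python
# Peak theoretical memory bandwidth from bus width and transfer rate.
# The transfer rates used here are illustrative assumptions only.

def peak_bandwidth_gbs(bus_width_bits: int, transfer_rate_mts: int) -> float:
    """Peak bandwidth in GB/s given bus width (bits) and transfer rate (MT/s)."""
    return bus_width_bits * transfer_rate_mts / 8 / 1000

# Same 128-bit bus, two memory types (example rates):
print(peak_bandwidth_gbs(128, 2000))  # DDR3-class  -> 32.0 GB/s
print(peak_bandwidth_gbs(128, 5000))  # GDDR5-class -> 80.0 GB/s
```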

I agree that if they're launched at decent prices you can't really blame Nvidia too much, and they're hardly the only ones who do this, but it absolutely needs to be called out in a launch article.
 

Takasis007

Honorable
Feb 3, 2015
79
2
10,640
"Well, compared to the new MacBook these are positively enormous."

It has just one port for power, monitor, etc., no slot for a memory card of any kind, and no way to connect directly to a router. They do make various adapters (those cost extra, by the way), but you have to pick which peripheral you use at any given time. It would suck to have to transfer large files over Wi-Fi when connecting directly is so much faster. It is one piece of equipment that is not on my shopping list. I honestly do not understand how having only one port would make anyone want anything to do with it. But this goes directly to the arrogance of Apple, expecting their customers to buy it and thank them for it. (Sorry for this, but that thing just rubs me the wrong way.)
 

hitman400

Distinguished
Jul 24, 2012
91
0
18,630
"Well, compared to the new MacBook these are positively enormous."

It has just one port for power, monitor, etc., no slot for a memory card of any kind, and no way to connect directly to a router. They do make various adapters (those cost extra, by the way), but you have to pick which peripheral you use at any given time. It would suck to have to transfer large files over Wi-Fi when connecting directly is so much faster. It is one piece of equipment that is not on my shopping list. I honestly do not understand how having only one port would make anyone want anything to do with it. But this goes directly to the arrogance of Apple, expecting their customers to buy it and thank them for it. (Sorry for this, but that thing just rubs me the wrong way.)

You don't see the bigger picture. Apple was one of the first to make a laptop that was not only powerful but thin, with a ventless bottom, using the aluminum case itself as a heatsink. If it weren't for Apple, NFC payments wouldn't have been pushed as hard (since everyone wants to use them now that Apple is promoting them). Now, the single port does severely limit things, like you said, but the IDEA is there. Companies will take what Apple has set forth and probably make a port that combines HDMI, USB, etc. - you know, common-sense combinations. That would eliminate the hassle of having 20 different cords, and I GUARANTEE you the future is about making things simpler.

As for the 960M and 950M: well, it's always nice to see a revolution in CPUs with the fanless Broadwell - nice to see that the 960M is just as revolutionary *sarcastic voice*.
 
It's gonna take some time till games actually run well on light laptops.
I estimate 3-5 years.
First off, the battery is always a problem here. Unless battery technology jumps forward significantly (or we basically start using Nokia batteries for everything... oh wait, it's Microsoft now, nvm), it will always limit laptops to being tethered to a wall socket.

This requires GPUs that perform like an Nvidia GTX 980 but consume 10 W.
That will take time indeed.
 

f-14

Distinguished
Anyone know what the actual difference is with the memory interface, in real-world performance?
GB/s of throughput, factoring in the clock speed (MHz).

"Well, compared to the new MacBook these are positively enormous."
Macs don't game. Period.
Have fun with Clash of Clans and Angry Birds, though... you do know you can play those on your Dpad and Dphone. How are you Mac fans enjoying those recently released 2003-era games like Conquer Online? I bet you feel all rosy now that you're able to play something, even if it's early-2000s obsoleteness.
How's that Thunderbolt working out for the new MacBooks... oh wait... yeah... get back to filling out your TPS report.
 

f-14

Distinguished
Anyone know what the actual difference is with the memory interface, in real-world performance?
Sorry, I forgot some numbers - I was too busy scoffing at the non-gaming-computer (not even Crysis!) iTard comment.

GDDR5 SGRAM conforms to the standards which were set out in the GDDR5 specification by the JEDEC. SGRAM is single-ported. However, it can open two memory pages at once, which simulates the dual-port nature of other VRAM technologies. It uses an 8n-prefetch architecture and DDR interface to achieve high performance operation and can be configured to operate in ×32 mode or ×16 (clamshell) mode which is detected during device initialization. The GDDR5 interface transfers two 32-bit wide data words per write clock (WCK) cycle to/from the I/O pins. Corresponding to the 8n-prefetch, a single write or read access consists of a 256-bit wide two CK clock cycle data transfer at the internal memory core and eight corresponding 32-bit wide one-half WCK clock cycle data transfers at the I/O pins.

GDDR5 operates with two different clock types. A differential command clock (CK) as a reference for address and command inputs, and a forwarded differential write clock (WCK) as a reference for data reads and writes, that runs at twice the CK frequency. Being more precise, the GDDR5 SGRAM uses a total of three clocks: two write clocks associated with two bytes (WCK01 and WCK23) and a single command clock (CK). Taking a GDDR5 with 5 Gbit/s data rate per pin as an example, the CK clock runs with 1.25 GHz and both WCK clocks at 2.5 GHz. The CK and WCKs are phase aligned during the initialization and training sequence. This alignment allows read and write access with minimum latency.
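As a quick worked example of how those clocks turn into bandwidth (using the 5 Gbit/s-per-pin figure above; the bus widths are just common configurations, not tied to any particular card):

```python
# GDDR5 per-pin data rate from the clocks described above:
# WCK runs at twice CK, and data moves on both WCK edges (DDR),
# so the per-pin rate = 2 * WCK = 4 * CK.
ck_ghz = 1.25                 # command clock from the 5 Gbit/s example
wck_ghz = 2 * ck_ghz          # write clock: 2.5 GHz
gbit_per_pin = 2 * wck_ghz    # 5.0 Gbit/s per pin

def bus_bandwidth_gbs(gbit_per_pin: float, bus_width_bits: int) -> float:
    """Aggregate peak bandwidth in GB/s across a memory bus."""
    return gbit_per_pin * bus_width_bits / 8

print(bus_bandwidth_gbs(gbit_per_pin, 32))   # one x32 chip    -> 20.0 GB/s
print(bus_bandwidth_gbs(gbit_per_pin, 128))  # 128-bit GPU bus -> 80.0 GB/s
print(bus_bandwidth_gbs(5.5, 256))           # 256-bit bus at 5.5 Gbit/s -> 176.0 GB/s
```

These match the 20 GB/s per-chip and 176 GB/s figures that come up further down in the quoted material.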

A single 32-bit GDDR5 chip has about 67 signal pins, with the rest being power and ground, in the 170-ball BGA package.

Commercial implementation
In 2007, Qimonda, a spin-off of Infineon, demonstrated and sampled GDDR5,[2] and released a paper about the technologies behind GDDR5.[3] As of May 10, 2008, Qimonda announced volume production of 512 MB GDDR5 modules rated at 3.6 Gbps (900 MHz), 4.0 Gbps (1 GHz), and 4.5 Gbps (1.125 GHz).[4]

Hynix Semiconductor introduced the industry's first 60nm-class 1 Gb GDDR5 memory in 2007. It supported a bandwidth of 20 GB/s on a 32-bit bus, which enables memory configurations of 1 GB at 160 GB/s with only 8 chips on a 256-bit bus. The following year, in 2008, Hynix bested this technology with its 50nm-class 1 Gb GDDR5 memory. Hynix 40nm-class 2 Gb GDDR5 was released in 2010. It operates at a 7 GHz effective clock speed and processes up to 28 GB/s.[5][6] 2 Gb GDDR5 memory chips enable graphics cards with 2 GB or more of onboard memory and 224 GB/s or higher peak bandwidth. 4 Gb density GDDR5 modules first started becoming available in the third quarter of 2013. Hynix released them first, and Micron Technology quickly followed with its own implementation in 2014. Modules from both Micron and Hynix, in both 2 Gb and 4 Gb densities, range in bit rate up to 7 Gbps.[7][8][9]

On June 25, 2008, AMD became the first company to ship products using GDDR5 memory with its Radeon HD 4870 video card series, incorporating Qimonda's 512 MB memory modules at 3.6 Gbps bandwidth.[10][11]

On February 20, 2013, it was announced that the PlayStation 4 would use sixteen 4 Gb (i.e. sixteen 512 MB) GDDR5 memory chips for 8 GB of GDDR5 at 176 GB/s (CK 1.375 GHz and WCK 2.75 GHz) as combined system and graphics RAM for use with its AMD-powered system on a chip, comprising 8 Jaguar cores, 1152 GCN shader processors, and AMD TrueAudio.[12]

On January 15, 2015, Samsung Electronics announced in a press release that it had begun mass production of 8 Gb density GDDR5 memory chips based on a 20nm fabrication process. To meet the demands of higher-resolution displays (such as 4K) becoming more mainstream, higher-density chips are required in order to facilitate larger frame buffers for graphically intensive computation, namely PC gaming and other 3D rendering. The increased bandwidth of the new high-density chips works out to 8 Gbps per pin across the 32 data I/O pins of the 170-ball BGA package, or 256 Gbps (32 GB/s) of effective bandwidth per chip.[13]

Judging by a best guess based on the memory power specifications of previous generations (DDR2 at 1.8-2.1 V and DDR3 at 1.5-1.65 V), GDDR5 should come in under the DDR4 spec of 1.2 V, so I'm going to hazard 1.0-1.1 V for GDDR5, which would be roughly a 1/3 reduction in power use - something considered oh-so-critical in the mobile world.

The 256-bit limiting factor explains why we are getting all this 128-bit stuff. We're being rationed with tidbit increases, most likely due to pricing: laptop manufacturers don't want to pay anything but rock-bottom prices for parts. I don't know how much the 0.3-0.5 V difference would affect power consumption, but I suspect merely 30 minutes of battery life at most, and that's not a big enough concern to raise costs by 30% by buying a 256-bit GDDR5 GPU.
 

Urzu1000

Distinguished
Dec 24, 2013
415
10
18,815


I greatly appreciate the detailed response that you've provided me. I had pretty much given up on getting a reply for that, let alone a helpful one!

Thanks!
 