Nvidia Intros New GeForce 700M Mobile GPU Series

So if you install the Experience software, Nvidia will no longer let you increase quality settings beyond what it thinks your video card can handle. In doing this you will never know how good your games could really look, just how good Nvidia thinks you should see them. Sounds great, where do I sign up... /sarcasm.

Enthusiasts stay away from gaming laptops because of all the marketing lies. Intel is not nearly as guilty of this as AMD and Nvidia, but then again, their video chips suck, so no enthusiast cares if they lie; it's still crap no matter how you spin it.

I wish the industry would just adopt a simple naming strategy...

If you perform just a die shrink / power decrease, add an "E" for Efficient; there must be a part with similar clocks and like hardware that uses more power for an "E" chip to be badged.
If you increase clock speed, add an "X" for Xtreme; there must be a part with the same hardware and lower clocks for an "X" chip to be badged.
Software updates (Optimus, GPU Boost 2.0, etc.) need to be done via the BIOS-level versioning digit: 77x.
Additional compute cores, texture units, or memory bandwidth differentiate the sub-version digit: 7x0.
When you create new silicon that adds instruction sets, modifies the compute pipeline length, or changes the connection type (PCIe 1 vs. 2 vs. 3), the primary version digit can be bumped: x70.

Too bad it will never happen...
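To make the proposal above concrete, here is a rough sketch of how such a badging scheme could be expressed in code. It is purely illustrative: the digit positions and the "E"/"X" suffix rules come from the post above, while the function name, parameter names, and example values are hypothetical.

[code]
# Rough sketch of the badging scheme proposed above (illustrative only).
# The digit positions and suffix rules follow the post; the function
# name and parameter names are hypothetical.

def badge(primary: int, sub: int, bios_rev: int,
          efficient: bool = False, extreme: bool = False) -> str:
    """Build a model string such as '760', '760E', or '761X'.

    primary   -> new silicon: ISA additions, pipeline changes, PCIe generation (x70)
    sub       -> core count / texture units / memory bandwidth differences    (7x0)
    bios_rev  -> software/firmware features such as Optimus or GPU Boost 2.0  (77x)
    efficient -> die shrink / lower power at similar clocks, appends 'E'
    extreme   -> higher clocks on otherwise identical hardware, appends 'X'
    """
    model = f"{primary}{sub}{bios_rev}"
    if efficient:
        model += "E"
    if extreme:
        model += "X"
    return model

# Same silicon and feature set: one die-shrunk part, one factory-overclocked part.
print(badge(7, 6, 0))                  # 760
print(badge(7, 6, 0, efficient=True))  # 760E
print(badge(7, 6, 1, extreme=True))    # 761X
[/code]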
 
[citation][nom]A Bad Day[/nom]EDIT: BTW, Nvidia and AMD are both guilty of the rebranding. "Dude, what do you mean the 610M is actually a slightly higher-clocked 520M? And that the 420M creams both the 610M and the 520M? Isn't 610 higher than 420?"[/citation]

420, heh heh. Not that I'm a pothead, but... even I get this one. Somebody April-fooled you and you didn't catch on, I think.
 
[citation][nom]dalethepcman[/nom]So if you install the Experience software, Nvidia will no longer let you increase quality settings beyond what it thinks your video card can handle. In doing this you will never know how good your games could really look, just how good Nvidia thinks you should see them. Sounds great, where do I sign up... /sarcasm. Enthusiasts stay away from gaming laptops because of all the marketing lies. Intel is not nearly as guilty of this as AMD and Nvidia, but then again, their video chips suck, so no enthusiast cares if they lie; it's still crap no matter how you spin it. I wish the industry would just adopt a simple naming strategy... If you perform just a die shrink / power decrease, add an "E" for Efficient; there must be a part with similar clocks and like hardware that uses more power for an "E" chip to be badged. If you increase clock speed, add an "X" for Xtreme; there must be a part with the same hardware and lower clocks for an "X" chip to be badged. Software updates (Optimus, GPU Boost 2.0, etc.) need to be done via the BIOS-level versioning digit: 77x. Additional compute cores, texture units, or memory bandwidth differentiate the sub-version digit: 7x0. When you create new silicon that adds instruction sets, modifies the compute pipeline length, or changes the connection type (PCIe 1 vs. 2 vs. 3), the primary version digit can be bumped: x70. Too bad it will never happen...[/citation]

Intel is far worse with their integrated graphics. Even with the same model number, their integrated graphics can have a wide performance range. That's like having some Radeon 7970s performing like the top Radeon 5870s while the top Radeon 7970s perform like a GTX Titan. It's a huge mess trying to figure out what can do what.
 
[citation][nom]blazorthon[/nom]Intel is far worse with their integrated graphics. Even with the same model number, their integrated graphics can have a wide performance range. That's like having some Radeon 7970s performing like the top Radeon 5870s while the top Radeon 7970s perform like a GTX Titan. It's a huge mess trying to figure out what can do what.[/citation]

In my experience, every Intel chip with the same model number performs exactly the same, and trying to figure out what they can and can't do has never really been an issue... They can't do playable 3D above 1024x768, they can't do anything with lots of shadows and lighting, they can't do MSAA/AF, they can't do hardware physics, etc., etc., etc.
 
