Nvidia GeForce GTX 1000 Series (Pascal) MegaThread: FAQ and Resources



So far they have been limited by the 8-pin power connection more than anything else. I think 215W is about the most I have seen, since it hits thermal limits before it can max out the 225W available to it. This is with the stock cooler, of course; who knows what the custom ones will be able to handle. The few DIY custom coolers we have linked to here show much better temps and get closer to the theoretical 225W power limit.

Remember, it can only give 120% of the power limit right now, which is 216W max.
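
If anyone wants to sanity-check those numbers, here's a quick back-of-the-envelope sketch. The 180W reference TDP is implied by the 216W figure above, and the connector limits are the usual PCIe-spec numbers; treat it as a rough illustration, not a measurement:

```python
# Back-of-the-envelope power budget for a reference GTX 1080.
# Assumes the 180 W reference TDP and spec connector limits.
PCIE_SLOT_W = 75       # max draw through the PCIe x16 slot per spec
EIGHT_PIN_W = 150      # max draw through a single 8-pin connector per spec
TDP_W = 180            # reference GTX 1080 TDP
POWER_LIMIT_PCT = 120  # max power-limit slider on the reference board

connector_limit = PCIE_SLOT_W + EIGHT_PIN_W   # 225 W theoretical ceiling
software_cap = TDP_W * POWER_LIMIT_PCT / 100  # 216 W from the 120% slider

print(f"Connector limit: {connector_limit} W")
print(f"120% power-limit cap: {software_cap:.0f} W")
print(f"Headroom left on the table: {connector_limit - software_cap:.0f} W")
```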
 
No biggie :) Lots of people have little choice but to go with laptops due to space restrictions. I prefer a desktop as well, with my nice big screen, but for instance, when I was deployed to a variety of war zones with the army, a desktop was not an option. I got the strongest laptop I could find so I could do what I wanted while living in my allotted 4 x 8 ft space (this included a bunk bed for two of us sharing said space). This GPU in a strong VR-ready laptop would be awesome for a user with limited space (dorm room living, anyone?)

But the power of the 1080M does sound like VR should be possible on a laptop with this GPU. That's a big step forward for laptop users.
 
Yeah, my oldest Steam buddy uses a gamer laptop. I used to love my Acer Aspire 8920 with its 18.4" 1080p screen and a 9600M GT. (I still have it, but the graphics card died.) I actually thought 18.4" was big and all I would ever need. Now I own a 22" IPS and vaguely have my sights set on a G-Sync monitor. The Acer Predator IPS 27" is about the only option in IPS, therefore I will wait until IPS becomes more commonplace in G-Sync. Maybe a Dell, with their zero-dead-pixel policy.

A 1080M would be a significant upgrade for laptop users, I guess. No reason to think it won't double 980M performance the way the desktop version does. Exciting times.
 


You are right, by the time Volta comes 4K probably won't be mainstream, but I predict that by 2018 a third of Steam users will be playing at 1440p, and by 2022 I think a third will be playing at 4K. I only say this because when it comes to technology, things progress exponentially. This, however, is just my prediction.

But what comes after 4K? It wouldn't be 5K or 8K... after a certain point your eyes can't tell the difference. I mean, hell, my Droid Turbo has a 1440p screen, and when I watch YouTube on my phone and switch from 1080p to 360p, I can hardly tell the difference; at 480p I really can't differentiate between 1080p and 480p at all. I realize this is because my Droid Turbo has a 5.2" screen at 565 pixels per inch, but IMO once we reach 4K, or at least whatever 16:9 resolution comes after 4K, it would be a waste of resources and computing power to push resolution any higher.
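
That 565 PPI figure checks out, by the way; here's a quick sketch anyone can run, using the Droid Turbo's 2560x1440 panel and 5.2" diagonal:

```python
# Pixel-density check for a 2560x1440 panel on a 5.2" diagonal
# (the Droid Turbo mentioned above).
import math

width_px, height_px = 2560, 1440
diagonal_in = 5.2

diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
ppi = diagonal_px / diagonal_in
print(f"{ppi:.0f} PPI")  # ~565, matching the quoted spec
```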

As far as VR headsets go, I don't think anyone gains anything after 4K; your eyes can't tell the difference. After 4K I think more work will be done on better LCD panels: panels that are better than IPS displays, with high dynamic range, 16-bit color, things like that.

But you are right, 17seconds, I do like to dream big :)

I really do think HBM3 will be out by 2020, though, and flagship GPUs will be sooooooo much smaller than the huge flagship cards of today. I also think SLI and CrossFire will be obsolete in a couple of years as well.
 


I feel we are already approaching that limit. 4K is already a very small improvement on any screen smaller than 27", and that won't change, so unless 4K screens become very cheap and the hardware to run them well becomes mainstream, I doubt we will ever get a third of users on 4K.
 
I looked around and finally found what I was looking for. It seems that with board power limited to around 225W, the clock probably will not get any higher than 2.1GHz at most. So those using a reference board paired with a water block, hoping to get a much crazier OC, can drop the idea.
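
For anyone curious why the power ceiling caps the clock, here's a rough sketch using the classic dynamic-power relation (power scales roughly with frequency times voltage squared). The voltages below are illustrative guesses, not measured values:

```python
# Why the ~225 W board limit caps the clock: dynamic power scales
# roughly as P ~ f * V^2. Voltages below are illustrative guesses.

def scaled_power(p_base, f_base, f_new, v_base, v_new):
    """Scale dynamic power with frequency and the square of voltage."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

BOARD_LIMIT_W = 225
p_now = 216.0              # draw near the 120% power limit
f_now, v_now = 2.0, 1.05   # GHz, volts (assumed operating point)
f_oc, v_oc = 2.1, 1.10     # hoped-for OC point (assumed)

p_oc = scaled_power(p_now, f_now, f_oc, v_now, v_oc)
over_under = "over" if p_oc > BOARD_LIMIT_W else "under"
print(f"Estimated draw at {f_oc} GHz: {p_oc:.0f} W "
      f"({over_under} the {BOARD_LIMIT_W} W limit)")
```

Even a modest voltage bump to hold 2.1GHz pushes the estimate past the board limit, which is why a water block alone doesn't buy much more clock.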
 


Yep. It will be interesting to see just how high they can be pushed with extra power.
 


I know we will. Technology follows three rules: it gets smaller, it gets cheaper to manufacture, and it gets more powerful and efficient. In the mid 20th century a computer cost millions of dollars, took up a whole room, and wasn't very powerful; today a basic scientific calculator has more RAM than the Apollo 11 lunar lander.

The integrated circuit isn't the only thing Moore's law predicts; it predicts how many cells we can fit onto a solar panel, the number of megapixels on cameras and displays, and much more. Now, as far as the integrated circuit is concerned, Moore's law does have its limits, and it's running out of steam. However, it's only running out of steam when it comes to the lithography of chips: once we reach around 8nm, quantum tunneling will start to become a problem. But the cost to manufacture an 8nm chip will be cut in half every 18 to 24 months. In ten years, chips are going to be so powerful and cheap to produce that they will be pretty much everywhere, and all connected to the internet. This is how we will finally reach self-driving cars (among other things): there will be chips along all the streets and highways coordinating traffic with the chips in cars.
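
To put numbers on that "cut in half every 18 to 24 months" claim, here's a toy compounding sketch. The halving periods are just the ones quoted above, and the math is nothing more than repeated halving:

```python
# Toy compounding sketch of "manufacturing cost halves every 18-24 months".
# Pure arithmetic on the quoted halving periods, not real fab data.

def relative_cost(years, halving_months):
    """Fraction of today's cost remaining after `years` of halving."""
    return 0.5 ** (years * 12 / halving_months)

for halving in (18, 24):
    frac = relative_cost(10, halving)
    print(f"Halving every {halving} months: after 10 years, "
          f"cost is {frac:.1%} of today's")
```

At the 18-month pace, cost drops to about 1% of today's within a decade; even at the 24-month pace, it's about 3%.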

I mean, hell, I remember when 1080p was the new thing: a 30" LCD television was over 3,500 bucks, and now I can go down to Best Buy and get one that's ten times better for under 300 dollars. 1440p and 4K will become mainstream faster than you think.

And there will always be other ways to improve chip designs besides increasing the transistor count. For all we know, engineers might be able to use quantum tunneling itself to improve chips; soon we are going to start using chips that use light instead of electricity; and there's quantum computing. We've already found other ways with technologies like FinFET, and FinFET can still be improved upon as well. Technology will always improve.

If you can't tell, I'm a pretty firm believer in the technological singularity 😀
 


 
I think it's going to come down to the DX12 products available, honestly. While one of the very few combinations I've never tried is your 760 SLI setup, I'm running two 980 Tis in SLI in one of my machines and two 970s in SLI in another, but it's like four-wheel drive: more parts mean higher power requirements and more things to break. I'm betting we see "just enough" from the 1080 to make the switch to one card worthwhile, at a price point far lower than expected. But one of my biggest regrets is buying the Titan when it first came out. And to be honest, none of us "needs" a 1080. Hell, a single 970 hardly slows down even at 4K with heavy floating-point rendering in the real world. A slightly jaded i5 4690K and a GTX 970, when built right, is all 98% of the populace could ever use.
 


When we reach the resolution after 4K, we will go for refresh rate: 200Hz 😉

 
The other thing to consider is that we are the minority when it comes to computer tech. The average computer user buys a PC and uses it FOREVER, and when it finally stops being useful they go buy a new one. For non-gamers, this could be a 10-year-old machine that still surfs the web and checks email perfectly fine. These people are finally upgrading and moving to Win 10 and 1080p screens, for the most part, since that is the current "norm". They never saw anything after Win XP, go straight to Win 10, and marvel at how different it is. Imagine going from XP to Win 10 and getting Cortana, Maps, and all those other neat apps.

They won't think about upgrading again for another 10 years, until that HP all-in-one they just bought is finally worthless again. Even a casual gamer is not really up to date on current hardware trends and probably does not even know what "Pascal" is, other than a triangle they vaguely remember from high school math class. These people may update sooner, but still not as often, nor do they care as much as WE do. So until the average PC is at 4K, I doubt we'll see it become the standard.

Many people ask me for custom builds and have no idea what they want, only that "the kids want to be able to play the new games." They get their PC at their budget and don't even ask, nor care, what's in it. If it needs an update, they bring it back and I do it for them. That's the normal PC user, in my experience.
 


Pythagoras...
😉
 
Yeah, I have a ton of things going on right now, but I work from home and have my PC tuned in most of the time. So pretty much every time I walk by the PC I can look real quick and see if anything new is going on 😀

I am also in a bunch of other threads right now as well. It's my mental break from everything else I've got going on...
 
Just returned my GTX 970 and am running on the onboard graphics (Intel HD 530). WOO HOO! GTX 1080, here I come... one of the custom versions, because I don't believe in the Founders Edition. Hahaha.
 


You're so right, man. The next leap in technology seems so surreal, but every year we are a step ahead of the last generation of hardware. Although I think Moore's law was prepared for years ago: clever marketing and releasing products in incremental architectural segments (i.e. Pascal) maximizes profits and target audience.

I think the ideas for future-gen technology have been engineered and tested in labs for a while now and are only waiting for the market to catch up (like 8K), which is a shame, but that's just the way things are. What we need is more competition to drive the market and put that innovative technology to use. 😛
 


Given enough hardware to drive it, and an appropriate screen...
Could your eyeballs tell the difference between 4K and 8K on a 60" screen at 15 feet?

Similarly, 4K on a 5" phone screen. Can you actually tell the difference?

We can update the hardware all we want. Can't upgrade the Mark 1 MOD 0 eyeballs.
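
You can actually put rough numbers on that. A common rule of thumb is that 20/20 vision resolves about 60 pixels per degree, so here's a sketch for the 60" screen at 15 feet (the acuity figure is an approximation, not a hard physiological limit):

```python
# Rough check: can 20/20 eyes distinguish 4K from 8K on a 60" 16:9
# screen at 15 feet? Uses the common ~60 pixels-per-degree rule of
# thumb for visual acuity (an approximation, not a hard limit).
import math

diagonal_in = 60.0
viewing_in = 15 * 12   # 15 feet, in inches
ACUITY_PPD = 60        # ~20/20 vision, pixels per degree

width_in = diagonal_in * 16 / math.hypot(16, 9)  # 16:9 panel width
h_angle_deg = math.degrees(2 * math.atan(width_in / 2 / viewing_in))
resolvable_px = h_angle_deg * ACUITY_PPD

print(f"Screen spans ~{h_angle_deg:.1f} degrees -> "
      f"eye resolves ~{resolvable_px:.0f} px across")
for name, px in (("1080p", 1920), ("4K", 3840), ("8K", 7680)):
    verdict = "beyond the eye" if px > resolvable_px else "still visible"
    print(f"{name}: {px} px across -> {verdict}")
```

At that distance the screen spans only about 16 degrees, roughly 1,000 resolvable pixels across, so 4K vs 8K (and arguably even 1080p vs 4K) is beyond what the Mark 1 eyeball can pick out.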