
Nvidia GeForce GTX 1000 Series (Pascal) MegaThread: FAQ and Resources

meaning this is probably the biggest change to GPUs since unified shader architecture was introduced back in 2007. if anything, other GPU makers will probably be forced to build their GPUs in a similar way to nvidia.
 
That's possible. This is big news! The new cards will save movie companies lots of money.

Cheaper hardware, lower energy costs, and a faster workflow.

My guess is it will take some time for game developers to utilize the new features of the cards, but when they do it will be amazing!
 
the effectiveness of Turing in handling hybrid rendering, mixing rasterization and ray tracing, was supposed to accelerate the adoption of ray tracing in games. Although the industry as a whole, mainly AMD and intel, still needs to adopt a similar approach to nvidia's before every game developer will go all in with RT. Console makers are definitely interested in this tech, especially sony. So at the very least AMD will most likely have to adopt something similar to nvidia's Turing approach to handling RT to keep the semicustom wins.
 


Yes, I think it will take some time. However, if the rumors are true, the next card is going to be launched on August 20th, and then we will find out.



Can you imagine games like the Star Wars demo?
 


Before, if I understand correctly, those effects were baked into the textures; now they will be done in real time!
 
Not much good news about the 20-series cards yet.

Was thinking they will probably be just as good for cryptomining as the previous generations.

Also, they are likely to get bought up by graphics designers for the ray tracing capabilities. I read that until now, graphics designers who want to render with ray tracing are getting through about five frames per day. (Might not be realistic, but the idea is worrying.)

Hence the high prices I guess.

 
According to this article, Coco wound up taking about 50 CPU hours per frame to render for the light-heavy sequences once they finished optimizing.

https://renderman.pixar.com/stories/the-world-of-coco

Of course, their first pass at it took over 1000 CPU hours per frame, then 450 after implementing new culling schemes and certain cheats that would treat multiple light sources as the same source. Then it went down to 125, then 75, and finally 50 as they custom tuned the software.
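Just to put those per-frame numbers in perspective, here's a quick back-of-the-envelope sketch. The frame count is a guess (24 fps and an assumed 90-minute runtime; the article doesn't give the film's actual frame count), and the 10,000-core farm size is purely hypothetical:

```python
# Back-of-the-envelope render-farm math for the CPU-hours-per-frame
# figures quoted above. Runtime and farm size are assumptions.
FPS = 24
RUNTIME_MIN = 90                   # assumed runtime, not from the article
frames = FPS * 60 * RUNTIME_MIN    # 129,600 frames

# Per-frame CPU hours at each optimization stage mentioned above.
stages = {
    "first pass": 1000,
    "culling + light cheats": 450,
    "further tuning": 125,
    "more tuning": 75,
    "final": 50,
}

for name, hours_per_frame in stages.items():
    total = frames * hours_per_frame
    # Wall-clock time on a hypothetical 10,000-core farm running flat out:
    days = total / 10_000 / 24
    print(f"{name}: {total:,} CPU-hours (~{days:,.0f} days on 10k cores)")
```

Even at the fully optimized 50 hours per frame, that's millions of CPU-hours for a feature film, which is why the "movie quality in real time" claims deserve skepticism.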
 
wow, that's a ton of cpu time for one frame!!

i do doubt though that this first-gen ray tracing is somehow gonna lower that to 10 frames per cycle or any of the other super optimistic things i'm seeing folks say are gonna happen. some new eye candy won't hurt, but i seriously doubt we're gonna see movie-quality gaming in 4k at 100 fps with these new cards like i'm reading all over the place.
 
That's actually a 1 year chart. The ~80% figure is comparing the current price to the one year high back in December. It's definitely fallen, but it did so over the course of 9 months (after a ridiculous bubble) not all today.
 
prices have fallen a lot over the last couple of weeks already. not sure how much lower they're gonna go, but with crypto dropping and the 2000 series coming, they've come back down to earth in price for sure.

may not be a bad time to pick up a quality 1070/1080 if you're not buying into the "ray tracing is gonna change the world overnight" hype. you can wait until it either takes off or falls by the wayside to upgrade down the line.
 
Looks like this thread has slowly come to a halt. ... However, I want to say that I cannot believe the price Nvidia are asking for 20-series cards.

Like the 1080 Ti was £800 new. The 2080 Ti is £1200. Ray tracing however is almost non-existent in games.

Since I run at 1440p, I can't see myself buying a 2080 Ti or 3080 Ti. Buying my 1080 Ti turned out to have been a good idea. At the time, I was not sure, as I did not buy it when the 1080 Ti was new. I bought it fairly late on, and thought maybe it was not such a good idea. Then, bam, graphics cards vanished due to mining. ... At the time I had a 980, and it was struggling a bit at 1080p sometimes. (Max settings, though.)
 
I suppose it really is a non-event for most people that already had high end Pascal cards. Except maybe anyone with a 1070 looking to go up a notch with a new monitor. Even then, the 1080Ti would be a good choice over the 2080.

The pricing doesn't surprise me, more or less what I expected. Only so much they can do to make such large chips on an existing process node. I'm hoping to hold out for the next shrink. The only sad part is it will re-adjust the market towards higher prices. I did a little number crunching and compared inflation and performance. It was quite revealing. I'm surprised they've held back on price increases as much as they have.
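For anyone curious, that kind of number crunching can be sketched in a few lines. The MSRPs below are the commonly quoted US launch prices, and the cumulative inflation factors are rough illustrative assumptions, not official CPI figures:

```python
# Sketch of inflation-adjusting GPU launch prices to 2018 dollars.
# MSRPs are commonly quoted US launch prices; the inflation factors
# are rough assumptions for illustration, not official CPI data.
launch_prices = {          # card: (launch year, USD MSRP at launch)
    "GTX 980":  (2014, 549),
    "GTX 1080": (2016, 599),
    "RTX 2080": (2018, 699),
}

# Assumed cumulative US inflation factor from launch year to 2018.
inflation_to_2018 = {2014: 1.06, 2016: 1.04, 2018: 1.00}

for card, (year, msrp) in launch_prices.items():
    adjusted = msrp * inflation_to_2018[year]
    print(f"{card}: ${msrp} in {year} is roughly ${adjusted:.0f} in 2018 dollars")
```

The point of an exercise like this is that inflation alone explains only a small part of the generation-over-generation price jumps; the rest is die size, process cost, and market positioning.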

I still have my GTX 980s sitting on my desk. Couldn't even unload them during the mining craze thanks to the 1060s, and now thanks to there being tons of used 1060s on the market. Just like with my 580s, probably sell them off when they are nearly worthless to someone on an extreme budget.

 


Can definitely hear the pain in your post.

Was checking again recently for cards, just to see what's what. Still only a very few 1080 Tis on sale, and barely any 2080 Tis or 2080s.

There were supposed to be a lot of Pascal GPUs already ordered by companies, and they are legally supposed to be upholding those deals. From that you'd expect lots of Pascal cards in stock, but still no.

The price of the new cards is the biggest shock to add to gamers' pain, though. Like the Asus 2080 Ti is £1300. That's not even a realistic price.

I freaked when I spent £730 on my MSI 1080 Ti, and felt like I got a good price. I only paid £400 for my GTX 980, so spending a lot is not normal to me.

1080 Tis are so rare that finding a matching MSI card to maybe go SLI is almost impossible too. I always doubted I would even try SLI, though, due to the complications and keeping the cards cool when they're so close together.
 
the availability issues are what make me laugh so hard every time i see a new thread here asking "are the new 2080's gonna be super discounted for black friday/cyber monday/christmas??"

such a silly question when you consider one can't even be found at the normal price....
 
I think, being a 1080 Ti owner, I could skip the RTX 2000 generation of cards. The worry is that card prices will continue to rise, or cards will become even rarer.

I think, as it will take time for ray tracing to be adopted, is it even worth it now? I am really not sure buying RTX now is a good idea. I think for gamers at 1080p, the 1080 Ti is still very powerful. Whereas I recently moved to 1440p and am worried, ahaha!
 
You'd think with the latest crashes in cryptocurrency that cards would be available, plus lots of used ones. There are actually a lot of used ones turning up on Amazon UK. I would not advise buying them, though, because they will have been hammered.

The worrying trend is that new cards are just not coming into stock on Amazon UK. There's barely a 1080 Ti or 2080 Ti to be had. I think there was one PNY across both card types. What are people supposed to do?

By the way please, is there a 2000 series megathread like this 1000 series thread?
 
i don't think there is a 2000 series thread like this. i didn't see anyone ask for volunteers to make one anyway and it's not stickied in this gpu subforum.

i enjoyed making it and all but RL got in the way and i was not able to put the time in to keep it up. it's a lot of work to track down all the info contained in the first few posts and not sure many have the time to do it on a voluntary basis.

if you're looking for card info itself, check out this awesome chart being done by raisonjohn. pretty much what i was doing but larger in scope!! major shout out to him for taking on such a task.

http://www.tomshardware.com/forum/id-3560418/gpu-amd-nvidia-sortable-comparison-tables.html
 
Math Geek, actually we do have a 2000 series megathread, it's been merged with the Volta megathread since both are so similar and since Volta didn't exactly have a consumer launch with consumer products:

http://www.tomshardware.com/forum/id-3771988/nvidia-turing-volta-megathread/