Nvidia GeForce GTX 1000 Series (Pascal) MegaThread: FAQ and Resources

Page 118

axlrose

Distinguished
Jun 11, 2008
Was just reading through that this morning too. My tower is a Lian Li x900, which is extremely shallow for GPUs now. Wasn't an issue when I purchased my 560 Tis way back. Funny to read about cards that are lauded as being shorter than most and then see that they don't fit in my case, even before I add a water cooling setup on the front end. :)
 

Math Geek

Titan
Ambassador
yah some of the cards are silly long. and others are way too tall. then again, there are some that are super long, tall and fat!!! makes the true double slot cards like evga's look almost small by comparison
 


while the 1080 is indeed expensive, i don't really think you can say the same for the 1070. in the previous generation you needed to cough up at least $650 to get that level of performance. now with this generation that performance has dropped down to the $400-$450 mark. that's definitely a welcome improvement.

also, while cards like the 480/1060 perform quite well in the majority of games, most of those are older games. both cards are considered mid-rangers this gen. they are fine running older games, but that won't necessarily be the case for future games. watch dogs 2, dishonored 2, and deus ex: mankind divided, for example, really show that the RX 480/1060 are mid-range cards when they cannot max out those games at 1080p.

https://www.techpowerup.com/reviews/Performance_Analysis/Watch_Dogs_2/4.html
https://www.techpowerup.com/reviews/Performance_Analysis/Dishonored_2/4.html
https://www.techpowerup.com/reviews/Performance_Analysis/Deus_Ex_Mankind_Divided/5.html
 

Math Geek

Titan
Ambassador
that's why i said so long as you can handle not having maxed settings......

those settings are the best thing game makers ever added to games. makes folks feel like they are missing out if there is another graphics level to use. it's just not that important overall. just like those people who swear they can hear the difference between a 128 kbps mp3 vs a 320 kbps one. it's just a pissing contest that makes no massive difference.

but keep spending the big cash for that extra graphics level and keep telling yourself you're a better person for it :D
 


I can distinguish them perfectly. From 256 kbps to 320 kbps is harder, but I can still tell which one is which. Listening to Trance in higher quality does make a difference. I don't know about FLAC/ALAC though.

I do agree some settings are *very* hard to perceive when dialing up the quality, but even if for some people it is a placebo effect, they are not going to be happy until they can dial them up to what they want, and nVidia knows that.

Just like we have Fiios, amplifiers, fancy DACs and $3K headphones, we'll have expensive video cards :p

Cheers! xD
 

Math Geek

Titan
Ambassador
of course you can so long as you know the bitrate before listening to it :p

but anyway, like every industry, there is a lot of money to be made so long as you can convince folks there is a difference between 1 thing and another. folks pay thousands for a bottle of wine, yet can't tell the difference in a blind test over a $3 bottle from the grocery store. folks pay tons of money for beats headphones, only to get the same inner workings as a $20 walmart set, yet they will swear they can "hear" the difference. folks pay a lot for "high" settings in a game just to swear there is a difference. and folks drop paychecks on an expensive watch that tells the time no differently than a cheap timex.

sadly the one thing you can really tell the difference with folks won't spend the extra cash for. they'll drink the cheapest swill beer they can get their hands on despite it tasting like week old urine filtered through a muddy, moldy work boot.

go figure. but hey keep spending big on that placebo, they need the cash to develop the next perfectly acceptable mid-range card for the rest of us to enjoy :D
 

TehPenguin

Honorable
May 12, 2016
128 kbps vs 320 kbps is a huge leap, tbh. Beyond 320 kbps is placebo territory for me and FLAC is a waste of space, but 128 kbps vs 320 kbps is comparable to 480p vs 1080p: both acceptable, but one has a clear advantage.

Anyway, I don't see why getting 50 or 57 FPS in a new game on max settings is a bad thing. IMO the 1060 holds up nicely in that regard. 30 FPS would be bad, but 50+ is great.
 

Math Geek

Titan
Ambassador
nvidia always has such great timing. nothing like taking the wind out of amd's sails time after time. i'm still excited about Vega, but how close a 1080ti will be to the titan X is always an interesting question
 

TehPenguin

Honorable
May 12, 2016
IF they repeat what they did with the 900 series, where the xx80 Ti replaced the xx80 price-wise, then the 1080 Ti will be my next card. I'm afraid nVidia is going for the extra dollar, though.
 

Math Geek

Titan
Ambassador
i'm expecting around $950 at release. since amd has nothing to compete with it, nvidia has no incentive to keep the price down. figure with the 1080 at $700 and the titan at $1200, $900-950 makes sense. custom cards might be cheaper but i doubt we'll see anything below $900 or so.

just my guess though
 


I tried that ABX 'double blind' test with most bitrates from 32 kbps through to 320 using MP3 vs WAV.

With a classical music sample I could just spot the difference at 128 kbps (high frequencies showed some artefacts), but not with a techno/trance sample. That one had to drop to 112 kbps before I could tell.

After that I decided to encode all my music at ABR ~128 kbps!
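
For anyone curious, the batch re-encode was nothing fancy. Here's a minimal sketch of roughly what it looks like, assuming the LAME encoder is installed and on the PATH; the music directory is just a placeholder:

```python
# Minimal sketch: batch-encode WAV files to MP3 at ABR ~128 kbps with the LAME CLI.
# Assumes `lame` is installed and on the PATH; the music directory is a placeholder.
import subprocess
from pathlib import Path

MUSIC_DIR = Path("~/music").expanduser()  # placeholder path, point it at your library

for wav in sorted(MUSIC_DIR.rglob("*.wav")):
    mp3 = wav.with_suffix(".mp3")
    # --abr 128 asks LAME for a variable bitrate averaging ~128 kbps
    subprocess.run(["lame", "--abr", "128", str(wav), str(mp3)], check=True)
    print(f"encoded {wav.name} -> {mp3.name}")
```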

Is it the same with graphics? Nearly. High or Very High vs. Ultra makes scant difference to me nowadays. :??:
 


One last point to make about lossy audio compression: I can notice the differences, and so can other people, because the headphones and/or speakers can actually make them noticeable. In particular, I have studio monitors and studio headphones here. Even though they're driven by the on-board sound card, the differences are still perceptible.

That being said, I would imagine higher settings on video cards are better appreciated on higher-quality monitors? I can vouch for 60Hz -> 120Hz+, but that is not about visual quality. Maybe at higher resolutions you can indeed notice the better eye candy effects? I can't remember noticing that when I moved from my old 1280x1024 monitor to 1080p, but I can say the bigger area was appreciated :p

Cheers!
 

Math Geek

Titan
Ambassador
i won't be buying a 1080ti either way, but it is always fun to speculate :) i am looking at a 1070 but now gonna wait for vega just to see what they are all about. i support amd personally and have not owned an nvidia card in a long time. so if vega is close to 1080 and priced well, i may get one to add to amd's bank account.

not that i need anything even close to that performance level, but figure i'm old enough now and make enough money that i can splurge on an expensive toy i don't need just cause :)
 

Math Geek

Titan
Ambassador
oh yah, it will be 2 days before vega hits. they love to time it like that. for me though it won't affect my buying decision other than what it may do for pricing. it'll be way more than i plan on spending. i'm figuring vega will be priced below a 1080 for sure. if it's priced right for its performance level, i'll go that way. i'm leaning toward a 1070, so it has to compete on price/performance with a 1070 for me to make that choice.

vega will cost more, i'm sure of that, but if its performance is enough to justify the price increase, i'll get one. otherwise the 1070 ftw card is still high on my list. either one will have to like watercooling, as i still intend to go stupidly crazy with a custom watercooled set-up.
 
Head to head against the GTX 1070, I'd bet. But I really want to see how they perform in CrossFire. Should the GTX 1080 Ti and TITAN X (Pascal) be worried?

But man, I really hope the RX 490 competes with the 1070 not only on performance but also on price.
 
How many FP32 FLOPS was the Vega Pro card they announced? ~13 TFLOPS? How many FP32 FLOPS is the Titan X Pascal? ~11 TFLOPS? And how do the current gens' FLOPS translate to gaming performance? The RX 480 is ~5.5 TFLOPS and the 1060 6GB is ~4 TFLOPS, yet they perform about the same in games?

So, if that is the case, and Vega's FLOPS translate to gaming performance about as efficiently as the RX 480's do, then it is going to sit just a bit under the Titan X-P in gaming. I'd say that is good news, right? Then that might also mean nVidia will release the 1080 Ti (which I believe they are stocking like crazy for the time being XD) right next to Vega.

This is very simplistic and I'm missing a lot of info (ROPs being the most important IMO), but should give us enough information to go around speculating where Vega might/could land, right?
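
To make the back-of-the-envelope math explicit, here's a quick sketch of the scaling I have in mind. All the TFLOPS figures are the rough numbers above, and the "gaming efficiency" ratio derived from the RX 480 vs 1060 pair is pure speculation on my part:

```python
# Back-of-the-envelope speculation only: scale Vega's raw FP32 TFLOPS by how many
# "Nvidia-equivalent" TFLOPS AMD's current gen seems to need for the same gaming perf.
# All figures are the rough numbers quoted above, not measured results.

rx480_tflops = 5.5       # AMD mid-range, roughly trades blows with the 1060 6GB
gtx1060_tflops = 4.0     # Nvidia mid-range
vega_tflops = 13.0       # announced Vega Pro figure (approximate)
titan_xp_tflops = 11.0   # Titan X Pascal (approximate)

# Gaming performance per TFLOP, AMD relative to Nvidia, based on the 480/1060 pair
efficiency_ratio = gtx1060_tflops / rx480_tflops          # ~0.73

vega_equivalent = vega_tflops * efficiency_ratio          # ~9.5
print(f"Vega, 'Nvidia-equivalent' TFLOPS: {vega_equivalent:.1f}")
print(f"Titan X Pascal TFLOPS:            {titan_xp_tflops:.1f}")
# -> by this crude model Vega lands a bit under the Titan X-P in gaming.
```

Roughly 9.5 "Nvidia-equivalent" TFLOPS against the Titan X-P's ~11, which is why I'd expect it to land a bit under.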

Cheers!
 

king3pj

Distinguished


I completely agree. I have a 1440p 144Hz G-Sync monitor and an Oculus Rift. I upgraded from a 970 to a 1070 because I thought the $400 I spent for it was a great value considering that it beats the $650 980 Ti and $1000 Titan X from the generation before. I also considered the 1080 but after looking at all the benchmarks I could find I decided that the 1070 was strong enough that I didn't need to spend the extra $200+.

If AMD had had a 490 to compete with Nvidia in both price and performance, I would have considered buying a FreeSync monitor and going with them instead. However, my purchases are purely based on performance and value. AMD is not a charity. They are a giant corporation just like Nvidia. They aren't going to get my $400 just because I feel bad that Nvidia dominates them in market share. They have to earn it.

I'm no Nvidia fanboy either. Before my 970, my previous GPU was a Radeon 7850.
 

king3pj

Distinguished


If you just get a standard monitor of any resolution or refresh rate, it doesn't matter whether you go with Nvidia or AMD. If you want one of the fancy monitors that constantly changes its refresh rate to match the exact framerate your GPU is putting out, it does matter.

I really like my G-Sync monitor because my 1070 doesn't put out 144 FPS at 1440p in most modern games. G-Sync keeps everything smooth and stutter free with no tearing whether I'm getting 50, 60, 100, or 144 FPS or anything in between. FreeSync monitors do the same for AMD users.

Since I already invested in a nice G-Sync monitor I'll be sticking with Nvidia GPUs for the foreseeable future. I could use an AMD card and still get the 144Hz out of my monitor, but I would lose G-Sync. By not having competition available for the 1070, AMD has basically locked me out of using their GPUs in the future even if they have the stronger card.