Nvidia Turing & Volta MegaThread

finally the architecture details are coming out. to me this is more interesting than seeing the RTX series' performance in games haha.

some of TPU's summary:

Raytracing adoption will also heavily depend on developers and publishers. It's great to see several API standards for raytracing, backed by all the major industry players. In my opinion the deciding factor will be adoption in consoles, though. Most titles today are developed for consoles first, which run on AMD hardware that has no support for raytracing at this time. No doubt AMD, Sony, and Microsoft are working hard to bring raytracing capabilities to consoles, which would intensify developer effort and convince publishers to allocate significant money and development time to these features.

AMD probably does not like Nvidia dictating which graphical feature game developers adopt next, but MS and Sony will probably force AMD to come up with something similar to Nvidia's RTX (especially Sony).

The shader engine has been upgraded to concurrently support execution of integer and floating point instructions, which will give a nice boost in every single game, without the game developer having to make any code changes.

hmm, typical nvidia? in general i think this is nvidia's main design principle for their GPUs: build features that are easy for game developers to take advantage of, to the point that developers don't need to support anything specific to reap the benefit. it reminds me of AMD's and nvidia's different approaches to increasing GPU utilization in games. AMD pushed async compute to raise their utilization, while nvidia made more significant architecture changes to improve overall performance (from Kepler to Maxwell). the end result was that AMD specifically needed a low-level API (Mantle/DX12/Vulkan) before they could use async compute, while pretty much any existing game could reap the benefit of the Maxwell design without needing to support a specific API.
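
Side note: a rough illustration of why that kind of change needs zero developer effort. Typical shader/compute code already interleaves integer index/address math with FP32 arithmetic, so separate INT32 and FP32 datapaths that can issue concurrently help existing code as-is. The toy CUDA kernel below is only a sketch of that instruction mix, not code from any actual game or from Nvidia.

```cuda
// Illustrative only: a kernel whose work is a mix of integer index math
// and FP32 arithmetic. On Turing, the SM's separate INT32 and FP32
// datapaths can issue these concurrently, so code of this shape gains
// throughput without source changes.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void shade(const float* in, float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // integer pipeline
    if (i < n) {
        int texel = (i * 7 + 3) % n;                // more integer address math
        out[i] = in[texel] * 0.5f + 0.25f;          // FP32 pipeline
    }
}

int main()
{
    const int n = 1 << 20;
    float *in = nullptr, *out = nullptr;
    cudaMallocManaged(&in, n * sizeof(float));
    cudaMallocManaged(&out, n * sizeof(float));
    for (int i = 0; i < n; ++i) in[i] = 1.0f;

    shade<<<(n + 255) / 256, 256>>>(in, out, n);
    cudaDeviceSynchronize();

    printf("out[0] = %.2f\n", out[0]);  // expect 0.75
    cudaFree(in);
    cudaFree(out);
    return 0;
}
```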
 
Now that I've had some time to really think about this, it's starting to make more sense. If you look at it from Nvidia's perspective, rasterization is nearing EOL as a high-margin product. For any gamer on 1080p, a 1070 or even a 1060 is basically all you need to guarantee 60fps. For 144Hz monitors, you might be able to justify going as high as a 1080 Ti. But there's no reason for them to upgrade beyond that, whether the 2080 Ti is 30% faster or 300% faster. Even if you're talking 4K high refresh, if the 2080 Ti had devoted all the extra silicon to CUDA cores it would probably be the last card anyone would ever need. At most, maybe one more generation past that. In order to keep selling new generations of high-end cards, they had to offer something other than higher fps, because that's approaching a dead end as budget cards edge closer and closer to acceptable performance.
 
That is the correct take for the RTX generation so far: early adoption tech.

For the "bleeding edge nerd", the RTX will find a home right away, but for the rest of us "enthusiast plebs" it just doesn't justify it's price hike/increase over features zero games before 2018 will include and 1080p@144Hz is still good enough with 1060-6GB / RX480 level of GPUs. I'd even dare saying performance, not using the new graphical bling bling, it's a linear power/FPS increase making the efficiency juuust slightly higher, if not non-existent.

I also believe we need more information on DLSS. I just have a weird feeling about it.

Overall, the original impressions still stand... Faster, but needs more oomph.

Cheers!
 
Moving away from pure high FPS... maybe there is some truth in that. Looking at the results at TPU, it seems even at 1440p we see a CPU bottleneck for the 2080 Ti. The gap between the 2080 and 2080 Ti widens a bit at 4K. And that is running on an 8700K @ 4.8GHz! Any faster and there's probably no CPU that can keep up with the GPU. And from my own experience, a CPU bottleneck can have a negative performance impact to the point that the supposedly faster GPU ends up being slower than the slower GPU. It is one of the worst kinds of bottleneck.
 

mjbn1977

Distinguished
I noticed a very interesting trend. People who currently own a GTX 10 series card are very skeptical of the new RTX cards, and people who own older tech are much more euphoric. Wanna bet that when the RTX 21 or RTX 30 series is introduced a year from now, you will see the same skepticism from RTX 20 series buyers and more positivity from GTX 10 series owners? Just wanted to share this observation. :)
 

mjbn1977

Distinguished


Yes, I read and watched a lot of reviews over the last few days, and several of them confirm that even with an overclocked 8700K you can run into CPU bottlenecks in some situations and games with the RTX 2080 Ti while playing at 1440p. It really looks like you only want to get the RTX 2080 Ti if you're exclusively going to play at 4K. For 1440p the RTX 2080 seems to be the better choice.
 

phenomk90

Honorable
Damn those prices. Initially I wanted to get an RTX 2080, but even the cheapest one costs around $867, while the cheapest RTX 2080 Ti costs around $1,297 in my country. That crazy pricing forces me to settle for a $670 GTX 1080 Ti instead.
 
now i kind of understand why nvidia is going back to one design for both gaming and compute with Turing, instead of separating them like what happened with Kepler/Maxwell and with compute Pascal (GP100) / gaming Pascal (the rest of the chips).
 

U6b36ef

Distinguished
I think the 2000 series cards are being bought up by miners just as fast as any cards. They are reported to be sold out on pre-order.

Also, it's possible that graphics designers will want to buy them too, to take advantage of ray tracing, which takes a long time by any other method.

What's really bad is that Nvidia are charging 50% more for the 2080 Ti, yet it performs maybe 25% better at most. Their argument is that ray tracing is the reason. However, ray tracing will slow games down a lot. They said they were aiming for 60fps in Battlefield V at 1080p. What's the point then? ... Battlefield V would otherwise not even be hard for the GPU to drive, without RT.

The real stickler is that they charge a lot more for a performance increase, like Pascal over Maxwell. Yet they did not charge less when they had problems, like 600 series cards locking up on the desktop, or way back when the 8600 card cores were failing.

£1200 for a top end card is taking the mick.
 

Eximo

Titan
Ambassador
I've tossed this topic around for a while, but GPU prices were surprisingly flat, if not in decline, for the last several generations if you exclude the mining boom. If you look at inflation, prices should have been going up, about to where Nvidia is selling the RTX cards now. If you compound that with the size of the GPUs they are making and what they can do, it is almost reasonable. Clearly the market can bear it as well.

I tried building up some examples, but I ran into price history difficulties; I'm not sure I trust what I am finding. Though some of the sources I looked at showed around 50% inflation ($1 in the year 2000 is roughly equal to $1.50 today). If you look at the launch prices of some of the high-end GPUs from back then, people were paying $500 in some cases (GeForce 3), which would be $750 today. The only real outlier is the 2080 Ti, but they didn't do their usual wait to release the Ti card.
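
(Quick check of that arithmetic, taking the ~50% cumulative inflation figure above at face value; the rate itself is the poster's estimate, not an official number:)

\[
\$1.00 \times 1.5 = \$1.50, \qquad \$500 \times 1.5 = \$750
\]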

Hard to quantify R&D costs and paying for process node improvements though so it could all be wrong.
 

U6b36ef

Distinguished
Maybe. I can't be totally certain, but can only state what I think.

I remember paying £400 for the top of the line Nvidia card fifteen years ago. That was a GeForce FX 5900 Ultra. It was an Asus one, the V9950 (256MB of DDR2).
https://hothardware.com/reviews/asus-v9950-ultra--geforce-fx-5900-ultra

That means prices have tripled in fifteen years. (Or doubled and a bit prior to RTX, considering you can still sometimes pay £1000 for a 1080 Ti.) I have no idea if that is in keeping with inflation. I think I saw no decrease in Nvidia prices when the 600 series debacle was going on.

It still indicates that Nvidia think we should pay more for slower frame rates if we want ray tracing. ... From my perspective that's at 1440p, because that's my monitor. If I can't get 60fps on a 144Hz monitor, what's the point?

Ray tracing may look better. However, we should not be paying more for it; it's just what's new. .. It's almost analogous to saying, e.g., a new AA system comes along and is adopted, so Nvidia charge more. Or, e.g., the new AA system does the same job for less graphics load, so Nvidia charge more for their cards because we get faster performance, even though they did nothing.

As I said, I think they are beefing up prices because of mining and because of graphic design capabilities. I also think any other company would pretty much do the same. ... Sadly, even if AMD could come back with something to save our pockets from Nvidia, miners would buy it.
 

Eximo

Titan
Ambassador
Launch price for the FX 5900 Ultra was $499 in the US. 2003-2018 inflation was 37.44%, which would make it the equivalent of about $690 today: basically the launch price of the GTX 1080. That puts the RTX 2080 only a little bit above inflation, and it is using shiny new memory technology. Not sure if US tariffs have gone into effect; I think that is soon. They seem to have kept prices relatively flat.
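
(For reference, the conversion works out as follows, taking the 37.44% figure at face value; $686 is the ~$690 quoted above, rounded:)

\[
\$499 \times 1.3744 \approx \$686
\]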

Really depends on how you look at it. You could say you get ray tracing and Tensor Cores as a bonus. It does make the silicon bigger, and that means fewer chips per wafer. Might not work so great on huge titles like Battlefield, but you might start seeing simpler games take advantage of it. Could produce some really cool effects if people decide it is worth it.
 

mjbn1977

Distinguished
Too much complaining about price. I don't understand it. Only good competition can bring prices down, but there is no competition for the RTX 20 series. So now it's AMD's turn to come up with a response. But I think we are at least 2 years away from any serious competition in the high-end segment, and by that time Nvidia will probably have the RTX 30 series ready....
 

Eximo

Titan
Ambassador
Well, I said I was going to jump on the 2000 series Ti card when it launched, but then they launched it at the same time as the others and set the price really high. Without the usual 80-class launch price drop to make room for the Ti, it is just too much to spend. So here's hoping for an Nvidia die shrink by EOY 2019 (I'm stuck with G-Sync, so probably not looking at Navi).
 

mjbn1977

Distinguished


Well, you will not get a high-end enthusiast Navi card in 2019 anyway, since AMD will start Navi with the mid-market segment and is talking about releasing high-end Navi in 2020.
 

mjbn1977

Distinguished
A correction in regard to my last post in this thread: it seems that AMD is actually starting with mid-range Navi cards in the first half of 2019 and planning to launch a high-end card in the second half of 2019. That card is supposed to match or come close to the performance of the RTX 2080, but without ray tracing features.
 
Yeah, I'm really excited to see what AMD has to offer to compete with the Turing cards!!! Hope Vega 2 turns out to be as good as it's rumored to be.

FYI guys, make sure any content you post here has some relevance to Turing or Volta; anything else is off topic.
 

mjbn1977

Distinguished


My post was in reference to Turing. It was a correction in regard to one of my earlier posts, which said Turing pricing is high because of no real competition and that this wouldn't change soon due to no planned high-end Navi card in 2019. But I was wrong and just corrected that in my last post, since according to AMD there is a Navi card planned for the 2nd half of 2019 that should compete in performance with the RTX 2080. And that could indeed have some impact on RTX 2080 prices.