What is NVIDIA's target market with the 20xx series of GPUs?

Jun 13, 2013
Their most expensive GPU, the 2080 Ti, cannot play with ray tracing at 1080p 60 FPS (let's not even talk about graphical settings). So the benefit of this card is that, in normal play, it is about 40% more powerful than the 10xx series.
Games come optimized for consoles. The current 1070/1080/1080 Ti, and I think even the 1060 and RX 580, are more powerful than the most powerful console out there, the Xbox One X. With the release of the 20xx series, the 10xx series will go down in price. A couple of weeks ago you could get a Zotac 1070 Mini for $290. If you have one of those cards, you are pretty much set for the next 3 years, until the new generation of consoles comes out. Don't you guys think so? If you get a 1070 at a discount, you are guaranteed 1080p Ultra 60 FPS and 1440p High-to-Ultra 60 FPS for a cheaper price.
 
Solution
The rank doesn't matter. The majority of the games you list are not that demanding, though I know Frostpunk is quite demanding: my 970 can't max out that game and hold 60 FPS at 1080p. Also, the real challenge is not 1080p, it is 4K: running demanding games like The Witcher 3 at max settings at 4K resolution with 120 FPS. There will be a market for the likes of the 2080 Ti.
"Games come optimized for consoles"

Yeah, for consoles, or console ports. But for games designed around the PC, no. AAA titles bring even the beefiest cards to their knees. (I should add that a top-end GPU can definitely last 3-4 years, but never expect to play at ultra settings for 4 years straight unless you don't mind slower frame rates or you play at surprisingly low resolutions.)

If the 20 series can perform that well in pure rasterized games, then the 20 series is marketed to every gamer once the entire lineup is fleshed out.
 
These are the top 8 PC games on GameRankings:
1) Into the Breach
2) Pillars of Eternity 2
3) Frostpunk
4) The Banner Saga 3
5) Warhammer: Vermintide 2
6) Civilization 6
7) BattleTech
8) A Total War Saga
Tell me, which of these games needs anything above a 1060 to play well? Monster Hunter: World runs at ultra with a 1060. Sekiro, Devil May Cry 5, and Resident Evil 2 require less power than Monster Hunter: World.
 
"The rank doesn't matter. The majority of the games you list are not that demanding, though I know Frostpunk is quite demanding: my 970 can't max out that game and hold 60 FPS at 1080p. Also, the real challenge is not 1080p, it is 4K: running demanding games like The Witcher 3 at max settings at 4K resolution with 120 FPS. There will be a market for the likes of the 2080 Ti."
 
I can see that. For the gamers who want to take 4K from 60 FPS to 120 FPS. That is only 1% of PC gamers. What about the rest?
 
Honestly, I stand by what I said. If you want guaranteed 1080p Ultra 60 FPS for the next 3 years, get a 1070. If you want guaranteed 1440p Ultra 60 FPS for the next 3 years, get a 1080. Obviously, get either at a discounted price. I do not know who wants to pay more than $1K to play at 1080p. Most people willing to pay that much money are already playing at 4K 60 FPS with a 1080 Ti.
 
Not 1% for sure, but definitely not the majority.

If you look at the Steam survey over the past several years, the GTX 760, 960, and 1060 win by far in popularity among the gaming community, because they are the best price-to-performance in the GeForce lineup.

But there are still quite a few, I believe at least 20% of gamers, who buy 1080s and 1080 Tis. So the demand is definitely there.

The reason for the RTX 20 series is simply to replace Pascal with more performance. Even if the performance seems overkill right at launch, that's the point: the card is still very powerful even 2 years down the road.

Then, going back to the games themselves, again it's ALL about the games you play. That's the nice thing about PC gaming: it's centered all around YOU, and you can choose when to upgrade or not.

EDIT: OK, I think I understand now, you're just talking about the ray tracing itself. With that I'd fully agree with you: unless you're a ray tracing guru, it doesn't make sense to buy a $1000 card just to get ray tracing support, especially at 1080p.
 
The new cards are for early adopters. There are people out there who like being among the first to own a new thing. These people are willing to pay to be able to play the first ray tracing games using the first real time ray tracing hardware.

The majority of people will be using older hardware for some time to come. Game developers and publishers know this, which is why no one will 'need' a ray tracing card for some time. Most games are designed for the largest possible market. It's going to be a long time before ray tracing is a large part of the market, assuming it ever is.

There are game developers who like to push the edge. These will be the ones to tack ray tracing support onto their upcoming games.
 
I don't think high-end gaming is only 1%. Probably not as many as mainstream gamers, but they are there and most likely increasing. JPR did research on gaming hardware revenue: back in 2016, the most money came from the high-end segment. In the past, mainstream hardware brought in the most revenue because of volume, but recently that's no longer the case. Volume-wise the high-end segment still isn't beating the mainstream segment, but the fact that its revenue has already surpassed the mainstream segment most likely points to an increase in gamers who are into high-end stuff. Why did Sony come out with the PS4 Pro? They outright said it was done to prevent their existing user base from leaving the platform for high-end gaming PCs. Those buying a PS4 Pro are the ones who want to play with high-end stuff like 4K resolution. We never saw this with 7th-gen consoles. The fact that console makers are jumping on the "mid-gen upgrade for a more powerful console" trend shows that interest in high-end gaming is increasing.
 


But you can have a REALLY powerful card right now, and let's say that in 3 years, with the launch of the new consoles, the new standard is 11GB of VRAM. What good is it to have an all-powerful 2070 or 2080? You will just be able to play at medium settings.
 
Yes of course. The flagship GPUs are for those who want the best of the best NOW.

But, I should add, even cards like the 980 Ti can run ultra settings at 1080p and 1440p without any problem (maybe turning down textures a touch, but really that's it). So really you'd have to jump ahead 5 years before dropping to medium settings.
 
We tried to answer your thread title: what is NVIDIA's target market for the new 20 series? But it seems that what you really want to talk about is how useless that expensive hardware is. So according to you: want good performance? Just buy current mid-range hardware like the 1070/1070 Ti. Yes, the 2080 or 2080 Ti is very fast right now, but it's useless anyway once new consoles come out and set the bar higher. To me, it sounds like all you want is to complain about that expensive hardware.
 
Regarding the example of the 980 Ti: it is a previous-generation GPU, so it has 6GB of VRAM. It was also the most expensive GPU released at the time, so I am not surprised it can still handle itself. The key thing is VRAM. The new cards will come with 8GB of VRAM, and in 2-3 years games will need more than that. It seems to me that the 10xx series can hold up, at a much cheaper price, for those 3 years. After that, go get an expensive card with much more VRAM. I don't know if I'm explaining myself well?
 
I disagree. While VRAM capacity is very important, the pure performance of the GPU cores themselves and the memory speed are just as crucial.

Plus, game makers DESIGN games around graphics cards as well. If most cards don't have over 8GB of VRAM, then they simply won't make a game that requires that much VRAM; instead they will make the game use other resources more effectively for better image quality.

The key question is: does the extra VRAM help that much during the card's lifetime? Or rather, will game devs make games that allow for lots and LOTS of VRAM usage, making it worthwhile for NVIDIA to put extra money into adding more VRAM capacity?

Also, we don't fully know how games are going to run or look in 2-3 years. With Moore's law slowing down, adding more power in a linear fashion, as has been done in the past, might not be the way forward. You just never know.
 
There are a lot of unknowable variables that could drastically shift the video card market. One possibility is resolution. If the prices on 4K monitors dropped low enough that anyone buying a monitor might as well buy a 4K one, since it costs the same, there might suddenly be a large increase in sales of 4K-capable video cards. That would make large amounts of VRAM the norm, which in turn would impact the way developers design their games.