Nvidia GTX 1180 Expected in July: What You Need to Know

Status
Not open for further replies.

dimar

Distinguished
Mar 30, 2009
AMD's failure is the reason I got a 1080 Ti. But I refuse to buy a $3000 monitor. I'm happy with my Samsung CHG70, which already cost close to $1000 CAD. I'm just mad that Nvidia won't support the adaptive sync feature.
 


What are you talking about? Are you saying you need to spend thousands on a G-Sync enabled monitor? Come on, man. Yes, everyone knows G-Sync monitors cost more. But 10x more? Don't be ridiculous.

By the way: I just checked on Newegg out of curiosity. For gaming monitors, they list 669 FreeSync monitors to 185 G-Sync monitors. Under the 2K/4K monitor menu, those numbers are 199 and 54 respectively. In both cases there are options under $600. You make it sound like there are no G-Sync options without taking out a bank loan.
 

bit_user

Judicious
Ambassador

I'm sorry you feel those are equivalent. Regardless of the competitive landscape, the lack of support for the latest VESA standards is a fully legit complaint about Nvidia's products.

Your tribalism helps nothing.


Do you think we don't know that?


HDMI also now supports it. In fact, the Xbox One X is/was slated to receive adaptive sync in a software update.
 

Krazie_Ivan

Honorable
Aug 22, 2012


if either company can make it through a consecutive 12 months w/o pulling some anti-consumer or illegal shenanigans... i'll stop informing people who don't know the ugly history of these two companies. promise. it's been 6/25yr & 10/40yr, respectively, so it's not good odds. but you have my word!

we'll start back at Nvid with GPP, and Intel with 28c 5GHz chiller. cool? :)
 

aries1470

Distinguished
Jun 26, 2008
Proofreading, anyone, at Tom's?
"The chips will be based on the Turing platform, but Nvidia hasn't announced a brand name. We've seen conflicting rumors that state that the new cards will be called either the 11-series or the 20-series, which suggests that the high-end chip would be either the GTX 1180 or GTX 1080. "
So, we are going to get the current chip again? Hmm... Very interesting.
 

bit_user

Judicious
Ambassador

Also kinda off-topic, doncha think?

Most people on here will have heard of GPP. So, I don't get what you hope to achieve by mentioning it. They already got a black eye for it and had to walk it back.

IMO, this thread should be about their upcoming products, not a referendum on whether you like Nvidia as a company. The more we can stay focused on the facts (and, in this case, rumors) rather than tribalism, the better.
 
Totally legit leak: The R in "RTX" stands for "Radeon". Nvidia decided to quit designing their own chips and instead put Vega inside. : D


It wouldn't surprise me if Nvidia were to support VESA Adaptive Sync (FreeSync) with their new graphics cards. Or at least, I suspect there's a good chance of it happening eventually. While Nvidia might like the idea of profiting off their proprietary G-Sync chipsets for as long as they can get away with it, there's a wide array of monitors that only support FreeSync, and for someone with one of those screens, the lack of support in Nvidia cards might only encourage them to stay out of the Nvidia ecosystem entirely.

Your numbers were somewhat off, though, since it looks like they include marketplace and open-box/refurbished items, which might include some additional models not stocked by Newegg, but also results in a lot of duplicate listings. Restricting the results to only new screens sold directly by Newegg, the ratio of FreeSync to G-Sync screens remains almost the same, but the numbers are much lower: only 93 advertised as supporting FreeSync, and 26 supporting G-Sync. That ratio is still close to four to one, though (and there are also some screens that support FreeSync while not explicitly stating that they do).


This doesn't really matter a whole lot though. The market for people buying $700 graphics cards for home use is fairly small, so AMD shouldn't necessarily feel a pressing need to compete in that market. Going by Steam's hardware survey, only a little over 1% of their gaming userbase have a graphics card of that performance level, and only a similar number have a 4K display. So, we're talking about relatively niche hardware here. That's in contrast to adaptive sync, which is something that people can make use of at all performance levels.

Now, certainly improving efficiency is something that AMD currently needs to work on for their graphics cards, and it sounds like they're in the process of doing that, though it takes a while to bring such a product to market. Better efficiency would also allow them to compete more directly with Nvidia's enthusiast-level cards. However, that's not just something they can pull out of a hat and have immediately available, since it can take years to redesign chip architecture. VESA Adaptive Sync, on the other hand, is something that would be relatively simple for Nvidia to add in the short term.
 

bit_user

Judicious
Ambassador

In my link above, AMD has stated that 7 nm Vega is 2x as efficient as the current generation. However, they didn't say on what workload. Regardless, we'll probably have to wait until early next year for a consumer version.
 

Krazie_Ivan

Honorable
Aug 22, 2012


wouldn't be fair of me to offer the deal to 10tacle w/o stating when each 12mo period starts.

it's not tribalism coming from me... i'm a fan of tech & innovation (but also transparency & morals) ...if AMD rebrands the same die while trying to market it as "new & improved", then it's entirely fair to bash on the underhanded dishonest tactic. if AMD had 2 series of rebrands, i'd expect grumblings of "better not be more rebrands" from people prior to the 3rd series launch, cause it's entirely fair to bring up that history to show a trend.

as consumers, it's perfectly within our rights to demand honest products & marketing. if a company has traditionally had a very hard time delivering on those, then we should feel no shame in exposing that behavior each time they get caught, & noting the trend. complacency tells the other party they can do worse next time & still not fear repercussion.
 

mlee 2500

Honorable
Oct 20, 2014


All I know is that I'm getting Screen Tearing just scrolling through this thread.
 

Sam Hain

Honorable
Apr 21, 2013


The FEs "may" sport a dual-fan cooler (per another online publication), eschewing the blower design, perhaps making them a more competitive option in more ways than one versus aftermarket cards. Time will tell...
 

dimar

Distinguished
Mar 30, 2009


Are you seriously defending Nvidia's G-Sync? A company that's deliberately blocking the most popular monitor feature gamers need??
 
Jul 28, 2018
Do we have much idea when 1080s and other older-generation GPUs will start getting price cuts due to the G/RTX 2080 release? When the cards are announced? When they're for sale by third parties? I have almost all the parts for my build except the graphics card, and I'm trying to save every penny.
 

bit_user

Judicious
Ambassador

No, we don't. However, what I seem to recall from the Pascal launch is that once benchmarks were out showing how the new cards compared to the old, and once the new cards' launch prices were known, the older cards' prices dropped to fit within the new lineup.

So, take the example of a GTX 980 Ti. Once benchmarks showed they performed similarly to a GTX 1070, that's when I got mine for $450 (FTW edition, after MIR). I don't know if the 1070s were even shipping yet, but the 1080s were.
 
Jul 28, 2018
bit_user

Ah, thank you, that's exactly the kind of information I was looking for: someone who's been watching GPU price fluctuations for a while and has some idea of the trends. Sounds like good news. I was hoping to get a cheaper GPU this August, and with the 20th looking pretty solid, things look promising.
 

bit_user

Judicious
Ambassador

Good luck. Who knows if we'll see a repeat, but the rumors were that they had a lot of Pascal inventory.
 

DDWill

Distinguished
Aug 3, 2009
Noticed TechPowerUp has already prepared pages for the new GPUs:

RTX 2080 Ti
https://www.techpowerup.com/gpudb/3305/geforce-rtx-2080-ti

RTX 2080
https://www.techpowerup.com/gpudb/3224/geforce-rtx-2080

RTX 2070
https://www.techpowerup.com/gpudb/3252/geforce-rtx-2070

and a cached page for the Titan RTX
https://webcache.googleusercontent.com/search?q=cache:6b7sxG8LLqYJ:https://www.techpowerup.com/gpudb/3305/titan-rtx+&cd=1&hl=en&ct=clnk&gl=uk


They all say at the bottom, "This is a placeholder page. Details will be added as they become available."

So nothing solid yet... still guesstimates.
 

shrapnel_indie

Distinguished
Jan 21, 2010



The only thing we know for certain is that there is an RTX series, alright - for the professional/workstation environment: Quadro RTX.

https://nvidianews.nvidia.com/news/nvidia-unveils-quadro-rtx-worlds-first-ray-tracing-gpu

The rest is speculation based on rumors and "leaks" at the moment... which is also where the GTX 11xx series naming came from. Until Nvidia actually tells us... as you say...




Oh, apparently, Nvidia is teasing... but does the tease mean anything? Is it a glimpse into a future release, or is it, as many speculate, the upcoming release?
 

The RTX 2080 naming seems pretty much confirmed. This teaser video that Nvidia released a couple of days ago (which is now linked in this article) for the Cologne event repeatedly highlights the letters RTX and the number 20 in chat messages, and even the numbers for the date at the end come up in the order 2080...

https://www.youtube.com/watch?v=F7ElMOiAOBI

I agree that the specs listed in TechPowerUp's database are likely to be based on less-substantiated rumors though.
 

DDWill

Distinguished
Aug 3, 2009
I think TechPowerUp most likely based these guesstimates on the Quadro RTX specs listed on Nvidia's website, as it's usual for Quadro cards to match the high/mid/low-end specs of their desktop equivalents, with slight changes to memory and clocks.


I think they are fairly good guesstimates, other than the Titan RTX, which I think will have 4,608 CUDA cores, 576 Tensor cores, and 12 or 16 GB of GDDR6 memory.


I can't help thinking that Turing is basically a slightly modded Volta. 512 CUDA cores missing, 64 Tensor cores gone, yet there are new RT cores, and Nvidia isn't listing how many are used... I think these RT cores will turn out to be Tensor cores optimised for ray tracing, or some slight architectural restructuring involving those 512+64 missing cores. Nevertheless, I'm sure whatever they've done, it's very clever.


From a business standpoint, it makes no sense for Nvidia to invest their biggest R&D budget to date in Volta and then not reap the benefits, recouping their investment with interest from the consumer market, leaving Volta only to the DCC, HPC, enterprise, and deep learning crowds... So I think when the hood is lifted, Turing will end up being a revamp of the existing Volta architecture. But I'm just guessing here like everyone else...

I watched that Nvidia teaser a few days back. Another site had a breakdown of all the Easter eggs hidden in the nicknames and chat messages:

Not_11, eating gimme 20 (not 11 series, 20)
RoyTeX (RTX)
Mac20 (20)
Eight-Tee (80)
GPS coordinates to Nvidia's GeForce Gaming Celebration
AlanaT (Alan Turing?)
#BeForTheGame Cologne 20.AUG.2018

Just 4 days to go...

How disappointed are we all going to be when they reveal...

Wait for it....

The new bendable Nvidia Shield!..... (plus a free glow-in-the-dark Nvidia logo mouse mat!)

 

shrapnel_indie

Distinguished
Jan 21, 2010


At the prices they charge these crowds (and can get away with), I'm sure they've been making their R&D money back already. But yes, it makes sense for them to take advantage of it as a base for their consumer products too.




As you say, we'll find out soon enough (or not; for some, four days is too long). We'll know if the tease is reality or misdirection. It probably is reality, though... they've gone through nine consumer series from the 1xx to the 10xx (excluding the OEM 3xx series). The old GeForce series, without the GT or GTX in front of the ID number, went from the original GeForce to the 9xxx series, and then the naming changed.
 