Nvidia GTX 1180 Expected in July: What You Need to Know


bit_user

Polypheme
Ambassador

And why is that such a dismal prospect?

Seriously, Pascal was considered a big improvement over Maxwell, and yet the GTX 1080 wasn't massively faster than a GTX 980 Ti (which is about equal to a GTX 1070).


Of course not! The GTX 1080 wasn't worth the money for most GTX 980 Ti owners, either. I have one, and I didn't even consider a GTX 1080 Ti worth the money. The GTX 1180 Ti will probably deliver the ~2x improvement I'm seeking.
 

ledhead11

Reputable


Also known as the "wife threatens to leave you unless she can buy something equally ridiculously expensive" edition.
 

ledhead11

Reputable
And let's not forget these up-and-coming models:

1130 - The "you know we made these to use as integrated, but Intel said no" edition
1130 Ti - The "we added 1GB of VRAM, but wait, they still said no" edition
1150 v2 - The "we added 3GB but it still sucks, and many people can't figure out why it struggles so much at 1080p" edition
1160 v2 - The "we found more spare leftover parts from the previous gen to sell you" edition
1160 Ti - The "even more spare parts, but 2% more performance" edition
1170 Ti - The "well, these weren't quite up to spec for the 1180, so here's a discount" edition
1180 v2 - The "our latest VRAM fab is finally up and running, two months after debut" edition
Titan XXXp - The "take out some loans and then realize you were better off with an 1180 Ti" edition

edit: Can't forget the marketing label that'll be slapped on all of them, "4K Ready!", followed by countless questions posted by users who don't get why they still can't play at ultra/60fps, or worse, 4K/144fps.
 

AgentLozen

Distinguished


Everyone, please keep in mind that these are only rumored model names. Take this list with a grain of salt until Nvidia's Turing cards actually launch.
 
Just to clarify things on GPU prices, folks: it's not just miners. It's memory chip demand. Samsung, SK Hynix, and Micron have to split their fabrication capacity between GDDR5 modules for GPUs and DDR4 memory for smartphones. This is also why PC memory prices have shot through the roof. Both Samsung and Apple are placing heavy demands on memory fabrication for smartphones.
 

Ninjawithagun

Distinguished

Valid sources with actual real-world results, please?
 

Ninjawithagun

Distinguished


Not true. Those are different fabrication processes that come from different factories altogether. Smartphones have been using DDR3 and DDR4 for a few years now, so there is no reason for there suddenly to be a manufacturing issue that would drive up prices. GPUs are overpriced for one reason only: cryptocurrency mining insanity.
 


Nope! That's actually called the "GTX 1130 You Shouldn't Admit To Having It Edition". This edition comes with a large box of tissues and a sympathy card. :pt1cable:
 


Wrong. DRAM fabrication has been biased toward DDR4 for smartphones and toward server market demand. This is not new, either; it's been going on since summer 2017. Read on for more info:

https://www.anandtech.com/show/11724/samsung-sk-hynix-graphics-memory-prices-increase-over-30-percent

http://www.advancedmp.com/memory-prices-on-the-rise/

Something else not mentioned in those articles: AIB video card partners can ramp up production, but they are waiting for GDDR5 chips to come in drip by drip. Also, the extreme popularity of the game PUBG has exploded in the Asian market, specifically China, which is also creating strong demand for video cards. It's really a perfect storm against GPU buyers. Then there was a power outage earlier this year at a Samsung NAND fab that destroyed an entire day's worth of production (estimates were around 3.5% of global NAND production for March).

So to say that it is ONLY miners causing the GPU shortage and price spikes displays a lack of information about the market conditions and their causes. Please understand I am not defending the stupidity of mining ruining us PC enthusiasts' GPU upgrade wishes. I just want everyone to be informed that there are several reasons why we are getting shafted right now on both DDR4 memory and GPU prices. Miners are without a doubt the largest part of the cause, but they are far from the only cause.

 

bit_user

Polypheme
Ambassador

So, what does NAND have to do with this?

BTW, I think it was just a power outage.
 


Yep, and only a 30-minute power outage at that. But it set Samsung behind schedule on NAND production, meaning they had to divert resources to get their numbers back up. That, combined with an already biased production shift toward smartphone and DDR4 memory demand, has effectively pushed the GDDR5 memory makers into having to balance production between the two. Some analysts are even claiming that the Big Three (Samsung, SK Hynix, Micron) are being pressured by Apple to push out more DDR4 smartphone memory, but I have not seen proof of that.
 

Krazie_Ivan

Honorable


They are all from the same architecture (Pascal), with "G_1__" just being a codename. All but one are either cut down from the full design or partially disabled to create the product tiers. The last time Nvidia launched a series with the uncut, non-disabled flagship of a GPU architecture was the 580, and even then it was just a fully featured rehash of the Fermi 480.

Semantics aside, it looks like we're on the same page as far as no longer considering a $750 GTX **80 card "high-end", which is what truly bothers me. Nvidia marketing has deliberately, and nearly, doubled the MSRP of the "flagship" card within a given architecture: a massive price hike in a short time period (amidst a global recession, no less).
 

bit_user

Polypheme
Ambassador

I think what you call "cut-down" is what Nvidia would say is "scaled-up". The way modern GPUs are designed is with sets of identical blocks that can be added and removed (during the design phase) to create larger and smaller chips. It's not like the GTX 1080 is literally a Titan Xp die with 1/3rd of the area chopped off or sitting idle.

Furthermore, there are sometimes differences in the way parts of the chip are connected. In AMD's HD 79xx series, they used a crossbar, while the 78xx series didn't (and instead used like a ring bus or something).

Anyway, there are a couple reasons Nvidia might not have started (for consumers) with the GP102. Maybe it had to do with yield and fab costs, maybe it had to do with the GTX 1080 already beating all AMD cards (of the time), as well as their entire previous generation, and they wanted to hold something back in case Vega was stronger... only Nvidia truly knows the reason.
 
I disagree with the suggestion that a GTX 1080 is anything but a high-end card. Perhaps the product lineups and pricing have been shifting a bit, but it ultimately comes down to what kind of hardware is actually necessary to play current games at the resolutions most people are actually using. By far, the most common resolution used for gaming is still 1080p, and even a GTX 1060-level card can run nearly all recent games with high settings at that resolution while maintaining around 60fps. So a GTX 1060 could be considered mid-range, as it does a relatively good job of meeting the needs of most gamers when paired with common hardware, even two years after its launch. Only when paired with a high-resolution display should people really feel that a card like that is lacking. At a higher resolution like 1440p, a higher-end graphics card might be needed for adequate performance. And a GTX 1080 should do a very good job of handling that "high-end" resolution, arguably making it a high-end card.

As for 4K, it's still pretty much on the borderline of what's usable for playing recent games at high settings and frame rates on even the highest-end hardware available today. It's an enthusiast-level resolution, and therefore an enthusiast-level card like a 1080 Ti or Titan is required to get the most out of it. Enthusiast-level hardware has always been really expensive compared to what most people are buying though, at least any time in the last decade or more. And while 4K might be getting pushed as the next big upgrade over 1080p in the world of televisions, in the gaming world the jump from 1080p to 4K is much larger than the more gradual progression of resolutions that is typically seen. And since it generally takes larger screens to really see the difference with a resolution that high, it might not even be ideal for everyone. It would make more sense for the bulk of gamers to transition to 1440p before thinking about 4K. One can't just expect graphics hardware to suddenly get three or more times as powerful overnight to be able to maintain the same frame rates while rendering four times the pixels at the level of graphical fidelity people have come to expect at 1080p.
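
To put a rough number on that pixel-count jump, here's a quick back-of-the-envelope sketch (nothing more than multiplying out the standard 1920x1080, 2560x1440, and 3840x2160 resolutions):

```python
# Pixel counts for common gaming resolutions, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")

# 1080p: 2,073,600 pixels (1.00x 1080p)
# 1440p: 3,686,400 pixels (1.78x 1080p)
# 4K: 8,294,400 pixels (4.00x 1080p)
```

So 1440p is roughly a 1.8x jump in pixels over 1080p, while 4K is a full 4x, which is why that intermediate step makes sense for most gamers.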

And ultimately, the rate of performance improvements for computer hardware is gradually decreasing as it becomes more difficult to shrink hardware further, and barring some major breakthroughs that vastly increase computing capabilities, advancements in graphics hardware performance may start to stagnate. I could definitely see things like foveated rendering with eye tracking being used as a way to get more performance out of the same level of graphics hardware, but even that will have its limits, and it isn't really something that's supported today.
 

AnimeMania

Distinguished


I agree with everything you said, but since I don't consider myself incredibly tech savvy, I try to purchase the highest-end computer I can afford, hoping the tech (especially the motherboard ports/slots) will still be relevant for 5 or more years, when I feel the tech might be outdated enough to consider buying a new computer again. I have had my computer for over 7 years, waiting for something mind-blowing to happen. I have added internal and external hard drives, an SSD, an upgraded OS, more memory, and faster graphics cards. What I wanted to change but didn't was the USB ports: I have USB 2.0, but external drives work better with faster USB ports. Although 4K, HDR, VR, and Thunderbolt might not be feasible now on the average computer, those are the types of things I might want to be able to do in the future that might make a new high-end computer worth getting. Right now I am looking for any excuse to purchase a new computer, and I am hoping that handling 4K/VR really well might be it. Then I'll wait 5+ years for the next new innovations to make me want a new computer again.
 

none12345

Distinguished
It's too early for 7nm cards. So if they are not waiting for 7nm...

What's probably going to happen is a new 12nm chip for the Titan/1180 Ti. Then the existing 1080 Ti chip will be rebranded as the 1180, with a cut-down version called the 1170. The existing 1080/1070 die will be rebranded as the 1160, with a cut-down version as the 1160 (4GB), and so on.

I bet we do not see a full stack refresh; it's going to be partial rebrands plus a partial refresh at the top end.

On top of that, each card is going up $50-100 in price. So a 1060 will be $300, an 1170 is going to be $400, an 1180 is going to be $600, and so on.

So you will get an 1160 which is the same as a 1070, and it will cost you 1070 money. But it will be called an 1160, so I guess that's an improvement or something.

That's my prediction!

NOTE: Existing dies moved to the 12nm process (which is really 16nm+) count as rebrands in this prediction. If they significantly change the core, then it's not a rebrand.
 

Ninjawithagun

Distinguished


Those links do not provide any technical reference to the actual fab process. The processes for assembling the memory chips onto PCBs are different for smartphones vs. PC motherboards. All in all, it's an artificial price hike to boost profit. I hope you are not just another sheep foolish enough to believe the lies. Or you could be an employee of one of the aforementioned memory manufacturers and are only promoting their BS.
 


1) I'm just reporting what other tech sites are reporting. You think they just make things up for filler stories?

2) What do you mean by "actual fab process" there? You think the memory makers are going to give up their industry secrets publicly? And what PCBs are used for different chips is irrelevant. It's like Chevrolet and GMC trucks made on the same assembly line at one plant: only X amount of Chevys and Y amount of GMCs can be made in a given production run. I can tell you for a fact that if Chevrolet sold more trucks than GMC, the GM plant in Texas would shift production more toward Chevrolet trucks to meet demand. There is currently more DRAM demand from smartphones and servers than from PC GPUs. Why is this difficult to understand?



Nothing but speculation with ZERO evidence. Not a shred. Just what you accuse my links of doing. And it's rather laughable when people like you come out of the woodwork claiming commenters like me are paid insider employees. I WISH. Now go find your evidence links and get back to me. I gave you mine from credible sites. And I'll close with another link related to my comment above about memory demand:

https://www.marketwatch.com/story/micron-bets-that-memory-demand-is-here-to-stay-2018-03-22

And I quote from it:

“For the data center, 10 years ago PC DRAM demand was seven times bigger than server,” Longbow Research analyst Mike Burton pointed out in a recent note. “Today, the demand for DRAM in servers is twice as big as PC.”

This is not 2005, when memory chips were only put in PCs, laptops, and video cards. There is explosive growth in many different market segments requiring memory chips, not just PCs, servers, and smartphones. This is the future, now. Everything from automobiles to VR headsets is included as well. Samsung is building another fab right now to meet more demand in the years ahead (https://www.anandtech.com/show/12498/samsung-preps-to-build-another-multibillion-dollar-memory-fab-near-pyeongtaek). Just so you know, Samsung by itself makes half of the global supply of DRAM chips. I would suggest you bloviate and speculate less and inform yourself better. :pfff:
 

Krazie_Ivan

Honorable


Playability and resolution have always been moving targets, arbitrary goals that shift continuously and have no hard figure to measure by. Further, you're ignoring the 120Hz+ market entirely with this argument.
...Cost, on the other hand, has been a very reliable and predictable factor. Adjust for inflation and you'll see that card pricing within a product stack has been relatively stable since the first dedicated GPUs. In 14+ years of data, "enthusiast-level" meant $450-700 MSRPs for the best consumer graphics card available, and it didn't even rise steadily (some bumps and dips along the way). Once the "best consumer graphics card" breached $700 by a WIDE margin, it took the meaning of "mid-range" with it. You can't separate the two.
 

Ninjawithagun

Distinguished


You have zero evidence. Quoting vague tech stories is not proof. Your childish approach of generalizing everything is sad. There are several differences between smartphone and computer memory, such as voltages and actual memory bus bandwidth. They have a few things in common but ultimately use different manufacturing processes; one cannot be used in place of the other due to these differences. I'm right and you are wrong. GET OVER IT.
 

bit_user

Polypheme
Ambassador

That's pretty strong wording over such low stakes. It's not like any of us can do a thing about this, so maybe give it a rest.

BTW, the irony about calling someone childish is that it actually makes you look weak, as though you can't argue the issue on its merits. And if you feel the other party is not abiding by the same rules or standards as you are, then it's better to withdraw from the debate than to resort to that.

Basically, internet arguments cannot be won. Once you've presented your facts and opinions, you need to give it a rest, whether or not they're accepted by the others.
 


Yet here you are, repeatedly fanning the flames. If you had taken your own advice, you would have either not bothered engaging in the first place or stopped once you had presented your side, no?

I don't really have a dog in this fight but come on, man. Your hypocrisy is showing. That goes for NWAG as well.
 

bit_user

Polypheme
Ambassador

What are you even talking about? I don't have a side.

I'm not saying I've never gotten in a flame war on here, but I think it's been quite a while and I never resort to insults. So, not really sure what you're calling hypocrisy.
 


You've been going on and on with NWAG over what amounts to nitpicking and classic one-upping each other. Why? That's what I'm talking about and it's... unnecessary. That's all.
 

bit_user

Polypheme
Ambassador

I think you're confused. My only reply to NWAG is the one above. And he never quoted me.
 