Nvidia GTX 1180 Expected in July: What You Need to Know

May 17, 2018
@SCOHEN158 You're scolding people who are posting valid thoughts and concerns about a published article with questionable sources. Your "a clock is right twice a day" defense is laughable; the best course of action would be to remove your invalid arguments. If you have nothing to add that validates or supports this article, you may want to take your trolling to a soap-opera website, where people might appreciate such useless posts.

I've been waiting on the next drop of Nvidia chipsets for months. Rumors without any solid foundation do more harm than good. Nvidia is very secretive about its tech, but that does not excuse false reporting.
 
May 17, 2018
The author really has no idea what he's talking about. NVIDIA manufactures neither the GPU nor the memory: TSMC fabricates the chips, and one of Samsung/Micron/Hynix produces the memory. Look to GDDR6 shipments as the biggest milestone.

2015 January: Micron/Samsung announces production of 20nm GDDR5
2015 June: Micron announces commercial shipment of 20nm GDDR5
2016 May: NVIDIA announces GTX 1070/1080

Memory companies only recently announced mass production of GDDR6, and none of them have reported commercial shipments yet. With new memory on a new die shrink, I expect plenty of manufacturing difficulties causing delays.
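If that pattern holds, the projection is easy to sketch. Here's a minimal Python sketch using the GDDR5 dates from the list above; the GDDR6 shipment date is purely an assumption, since no commercial shipments have been reported:

```python
from datetime import date

# GDDR5 cycle, per the timeline above.
gddr5_shipment = date(2015, 6, 1)     # Micron announces commercial shipment of 20nm GDDR5
gtx_10xx_announce = date(2016, 5, 1)  # NVIDIA announces GTX 1070/1080

# Lag from commercial memory shipment to GPU announcement last cycle.
lag = gtx_10xx_announce - gddr5_shipment
print(f"Shipment-to-launch lag: {lag.days} days (~{lag.days // 30} months)")

# Hypothetical GDDR6 projection: if commercial shipments were to start
# mid-2018 (an assumption, not a reported date) and the same lag applied,
# a launch would land well past July 2018.
gddr6_shipment_guess = date(2018, 7, 1)  # assumed, not a reported date
print(f"Projected launch if history repeats: {gddr6_shipment_guess + lag}")
```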

Also, with Pascal raking in loads of cash and no threat from AMD in sight, do you really think NVIDIA is going to rush out the GTX 1170? They won't, and even if they wanted to, they couldn't, because they're bottlenecked by the memory producers.
 
May 17, 2018
To further my point, Samsung/Micron/Hynix are also raking in loads of cash selling existing memory (probably even more than NVIDIA), so they are in no rush to push the technology envelope either. The latest accusations (class-action lawsuits) allege that the three memory producers are colluding to fix memory pricing. If true, then they are probably also colluding to NOT release GDDR6 so they can milk GDDR5 for as long as possible.
 
The Founders Edition is stupid, so why do they even make that crap?
Also, the 1170 is probably more the tier I should buy, but waiting is boring.
The SEK is weak, so they will charge even more than before. Terrible.

At around two dollars per AMD share I tried to buy just 4,000 SEK worth, but as usual I bid a bit below the current price, even though I had waited for the actual trend shift to happen first; the order never filled, which was stupid of me, as always.
Otherwise that would easily have paid for a whole machine.
(The original idea was to get one for the price of a graphics card.)
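For what it's worth, a rough sketch of how that missed trade would have worked out; the exchange rate and AMD's later share price here are ballpark assumptions, not figures from the post:

```python
# Back-of-the-envelope check of the missed AMD trade described above.
# The ~8.7 SEK/USD rate and the ~$12 May-2018 share price are assumptions.
budget_sek = 4_000
sek_per_usd = 8.7            # assumed exchange rate
buy_price_usd = 2.0          # per the post: ~$2 per AMD share
later_price_usd = 12.0       # assumed approximate AMD price in May 2018

shares = (budget_sek / sek_per_usd) / buy_price_usd
value_sek = shares * later_price_usd * sek_per_usd
print(f"{shares:.0f} shares -> ~{value_sek:,.0f} SEK")  # ~24,000 SEK: a whole machine
```

Note the exchange rate cancels out: the final value is just the budget times the price ratio, a 6x gain under these assumptions.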
 

bit_user

Are you familiar with the concept of market saturation?

There are two incentives for Nvidia to get on with the next generation. The first is that most people who want a Pascal GPU and can afford one will have bought it (i.e. classic market saturation). The second is the projected decline in cryptomining-based GPU demand, meaning they need to do something to create demand and drive sales in order to replace that revenue. And a third follows from this: if cryptomining-driven demand does plummet, there will probably be loads of used GPUs flooding the market, and Nvidia will need to give people a reason not to just buy those... or else their revenues will take a double hit.

Of course, because of cryptomining-driven price inflation, the market saturation among gamers (and other semi-pro users, like @kyotokid) is less than it should be after two years. But a lot of folks who've been waiting for prices to drop are probably now going to wait for the next gen, because they feel it's about time and don't want to be buying into old tech.
 

Kaziel

@RYGAR

You've got it mixed up. The rumors say that Vega 7nm (2018) is the Pro-only card, whereas Navi (2019) is the one with 1080-level performance for ~250 bucks.
 
If Navi matches GTX 1080 performance, I am just going to CrossFire two of them and get rid of my 1080 Ti.

As funny as it sounds, I believe Vega 7nm could be a monster in the consumer market. The latest rumors mentioned something like a 70% performance increase at the same clocks. Honestly, it seems too good to be true.
 

KidHorn

I assume the first cards will be gobbled up by crypto miners. Now is a lousy time to be buying hardware. Wait until cryptos go bust. And they will.
 


You forgot the 1130 "My grandparents bought me this for Christmas, but it can still play Fortnite at 30 fps" edition.
 

Ninjawithagun

All of this anticipation, only to be ruined by cryptocurrency miners, who will once again be solely responsible for exorbitant prices on day one. Good luck actually trying to get one. It was tough enough to get one in the initial month of release before the cryptocurrency madness; now it will be just about impossible for several months after launch. Sad. What Nvidia should have done is create separate lines of GPUs: one that can mine and one that can game but not mine.
 

mlee 2500

NVIDIA 10 series is "getting long in the tooth"? Are you kidding me?

"We've been through a number of Intel and AMD CPU generations (since the release of the NVIDIA 10 series two years ago)."

You're joking, right?

Core for core, a top-of-the-line i7 CPU today is only about 30% faster than an i7 processor from 2012... *barely* qualifying as even a single generational improvement, IN OVER SIX YEARS, unless you factor in the additional core counts (and even then...).

During those same six years, NVIDIA has released the GTX 600, 700, 800, 900, and 1000 series GPUs... FIVE iterations, each substantially and materially faster than its predecessor, and easily qualifying as at least three "generational" levels of improvement.

If you had a top-of-the-line i7 processor in 2012, you would have been better off upgrading the video card every two years until now than bothering to upgrade the CPU even once. That's a fact.* And heading into 2019, that math doesn't seem to be changing.

When it comes to semiconductor companies not iterating their performance substantially and often enough, it's not NVIDIA that's coming up short.


*Combine a GTX 900 or 1000 series card with a six-year-old i7-3770. Compare that to any older GTX-series card paired with the latest Intel i7-8700K. One of those combinations will still rate as the higher-performance rig; the other will barely be able to play the latest titles at all.

Hint: It has nothing to do with whether you use the latest or oldest Intel product.
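As a quick sanity check on that math, here is a back-of-the-envelope sketch in Python; the 30% CPU figure over six years is from the post above, while the ~30% per-GPU-generation uplift is an illustrative assumption:

```python
# Compare implied improvement rates for the CPU and GPU cases above.
cpu_gain_over_6yr = 1.30                  # ~30% total over six years, per the post
cpu_annual = cpu_gain_over_6yr ** (1 / 6)
print(f"CPU: ~{(cpu_annual - 1) * 100:.1f}% per year")  # ~4.5%/yr

gpu_gain_per_gen = 1.30                   # assumed average uplift per GTX generation
generations = 5                           # GTX 600/700/800/900/1000, per the post
gpu_total = gpu_gain_per_gen ** generations
print(f"GPU: ~{(gpu_total - 1) * 100:.0f}% over {generations} generations")  # ~271%
```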
 

Anonymous_4

FINALLY. WHAT TOOK YOU SO LONG? DO YOU REALLY HATE CUSTOMERS THIS MUCH NVIDIA? HAVE SOME BRAVERY TO PLEASE THE BUYERS THAT GOT YOU TO THIS POINT IN THE FIRST PLACE INSTEAD OF SHAREHOLDERS! THE LEAST YOU COULD HAVE DONE WOULD BE TO PUT OUT A RELEASE DATE FOR THESE CARDS! AND YOU WONDER WHY PEOPLE LOVE AMD!
 

mlee 2500

Five generations of MODEL numbers, maybe. And the older board and CPU may throttle the card a bit... but you would still be better off with the latest NVIDIA card combined with a six-year-old i7-3770K than with any previous-generation GTX GPU combined with the latest i7-8700K.
 


1) Yeah, and the Fury X was rumored to be a GTX 980 Ti killer too. It was everything BUT that, and it drew more power.

2) Good luck getting that second AMD GPU to scale in future games. Multi-GPU support is a dying platform. And just to remind you, AMD still has no competitor to the top-level GTX 1080 Ti class of GPU.

 

Krazie_Ivan

"In other words, the high-end card will be called either the GTX 1180 or GTX 2080."

i don't think cut-down/partially disabled dies qualify as high-end, especially when less-hobbled Ti & Titan versions of the same die are released months later, once the process/yields mature. Unless you think an MSRP of $750 is reason enough to justify the "high-end" moniker?
...i miss "flagship" series launches. Now we get mid-range launches, followed by $1,300 flagship GPUs.
:/
 

bit_user

The *80 model of their last two generations hasn't been cut down from anything.

GTX 1070, GTX 1070 Ti, and GTX 1080 are all based on the GP104 die.

GTX 1080 Ti, Titan X (Pascal), and Titan Xp are all based on the GP102 die.

That said, I think we can agree that the *80 is upper mid-range - not high-end.


This is more or less a matter of economics. Although, if they use 12 nm, I'd imagine it's mature enough that they could actually start with the Titan card.
 

ddferrari

It has already been proven many times over that AIB 1080 Tis have no problem maintaining 60 fps at 4K on Ultra/High settings in all but a handful of games.

This sounds like the typical procrastinator's lament.
 

ddferrari

The all-caps post made me NOT read it at all, and I'm replying simply to tell you that all caps doesn't work for anyone.

Try again.
 

ddferrari

Has anyone noticed that the 1180's specs are equal to or lower than the 1080 Ti's, save for a ~7% increase in memory clock speed and the move to GDDR6? Many of the crucial specs are quite a bit lower.

If the 1180 surpasses the 1080 Ti by 15% through the "magic" of a smaller die, I'll be shocked. It looks to be 1080 Ti 2.0 at best. The 1180 Ti, on the other hand... well, we'll see.
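The bandwidth arithmetic backs this up. A quick sketch, where the 1080 Ti numbers are its published specs and the 1180's 256-bit bus with 16 Gbps GDDR6 is the rumored (unconfirmed) configuration:

```python
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate times bus width, over 8."""
    return data_rate_gbps * bus_width_bits / 8

# GTX 1080 Ti (published spec): 11 Gbps GDDR5X on a 352-bit bus.
ti = bandwidth_gb_s(11, 352)    # 484.0 GB/s

# Rumored GTX 1180 (assumption): 16 Gbps GDDR6 on a 256-bit bus.
new = bandwidth_gb_s(16, 256)   # 512.0 GB/s

print(f"1080 Ti: {ti:.0f} GB/s, rumored 1180: {new:.0f} GB/s "
      f"(+{(new / ti - 1) * 100:.0f}%)")  # ~+6%: the narrower bus eats most of the faster memory
```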
 

cerealkeller

If that's the spec, these cards will barely outperform a 1080 Ti. Nope. I'll wait for whatever comes next unless this is at least a 30% performance improvement across the board. An updated process, so maybe a 10% improvement per clock; GDDR6, so maybe another 5% or a bit more. Ray tracing is their big selling point here. Not worth the money for 1080 Ti owners. Add HBM2 and it might get closer to a 20+% improvement over last gen.
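Multiplying out those rough guesses shows how far short of 30% they land; the 10% and 5% figures are the estimates from the post above, not measured numbers:

```python
# Compound the rough per-component estimates from the post above.
process_uplift = 1.10   # assumed ~10% per-clock gain from the updated process
memory_uplift = 1.05    # assumed ~5% gain from GDDR6
combined = process_uplift * memory_uplift
print(f"Combined uplift: ~{(combined - 1) * 100:.1f}%")  # ~15.5%, well short of 30%
```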
 


While it's possible that there could be a resurgence of cryptocurrency mining that significantly affects the prices of these cards, signs are that the demand for mining cards is already waning. Prices of graphics cards have been on a continuous downward trend in recent months, and at least at major online retailers in the US, most cards are currently priced within about $50 of what they were a year ago, before mining started really affecting prices. If this trend continues, existing cards should be back near their suggested retail prices by the time these new cards launch.
 

kyotokid

...actually, GDDR6 is faster in tests, less expensive, and easier to produce than HBM2. For now, I'm content with the Titan X I recently picked up. Yeah, it's a generation older than Pascal, but it's a big step up from what I used to have, it's perfect for rendering the large scenes I create, and I can't complain about the price I paid.

 