GTX 980 Ti @ $650 USD = Overpriced?!?

Page 2

thehotsung8701A

Reputable
May 18, 2015
The GTX 980 Ti is about $150 overpriced. It is not aggressive pricing, regardless of how expensive the Titan X is. The Titan X is overpriced garbage and should not be compared in price to the GTX 980 Ti. This is great marketing by Nvidia to fool buyers into thinking they got a great deal just because the Titan X was the biggest ripoff possible.
 
Solution
Even with the new Fury X, you pay $650, but you can't max every single thing out and still get above 60 FPS, or even playable framerates. You would need many more shaders to be capable of doing that. NVIDIA's Pascal is estimated to pack about 5,000 CUDA cores with HBM, but that comes out in 2016. So no, none of this will be possible for a while. If anything, you can only OC to your heart's content.

DrakeConnar

Distinguished
Apr 16, 2008



I have degrees in both Finance and Marketing. While the cost to produce a product is taken into account in determining a bottom-line break-even point, anything above that is completely controlled by marketing. Finance tells marketing they need $400 a card not to lose money on it. Marketing then looks at what the market supports and their competitors' offerings.
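(Just to illustrate that break-even logic with a minimal sketch: the fixed-cost and volume numbers below are made up for illustration; only the $400 per-card floor comes from the example above.)

```python
# Hypothetical break-even sketch; every figure here is invented for
# illustration, not a real NVIDIA number.
fixed_costs = 50_000_000   # R&D, tooling, masks, etc. for the product line ($)
unit_cost = 400            # finance's per-card floor: sell below this, lose money ($)

def units_to_break_even(price):
    """Cards that must sell at `price` before the line stops losing money."""
    margin = price - unit_cost
    if margin <= 0:
        return float("inf")  # at or below unit cost, no volume ever breaks even
    return fixed_costs / margin

for price in (650, 800):
    print(f"${price}: about {units_to_break_even(price):,.0f} cards to break even")
```

Anything marketing charges above that $400 floor just changes how many cards it takes to cover the fixed costs; the rest is positioning against the competition.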

The reason we saw an estimated $800 price point and then a real introduction at $650 was to stick it to AMD. AMD says $850, Nvidia says $800. They knew AMD could match $800 and maybe even perform better, thereby making the 980 Ti outdated. So they drop the Ti to $650. Now AMD is up a creek. If their card doesn't deliver higher performance, they have to take a huge hit on expected profit or they won't sell any Fiji. This is a market war, and $650 is an aggressive competitive move based on Nvidia's expectations of Fiji. They threw a spiked ball into AMD's court, daring them to grab it.

If AMD cannot outperform the Titan (and even if they only match it), everyone will skip their "revolutionary" product for a 980 Ti unless they take a $200+ hit as well. At this point Fiji will either release at $650 as well or had better just destroy the Titan to justify a higher price. Either way, Nvidia wins by cutting AMD's profit and potentially their chance to regain market share. The only ones who lose are the Titan owners who did not need all the power of the Titan. If AMD does wildly beat the Titan at a lower price or at $850, Nvidia still commands the sub-$700 market, as the Ti will very likely outpace the 390X (a non-Fiji 290X rebrand).


 

g-unit1111

Titan
Moderator


I am an accounting major and I was looking at it more from an accounting perspective as to why the price is so high. I understand things like break-even points and inventory; I know that they have to sell a certain number of units at a certain price in order to break even and move inventory. Accounting would much rather move inventory the LIFO/FIFO way and put all the money toward covering manufacturing overhead costs.

DrakeConnar said:
The reason we saw an estimated $800 price point and then a real introduction at $650 was to stick it to AMD. AMD says $850, Nvidia says $800. They knew AMD could match $800 and maybe even perform better, thereby making the 980 Ti outdated. So they drop the Ti to $650. Now AMD is up a creek. If their card doesn't deliver higher performance, they have to take a huge hit on expected profit or they won't sell any Fiji. This is a market war, and $650 is an aggressive competitive move based on Nvidia's expectations of Fiji. They threw a spiked ball into AMD's court, daring them to grab it.

Absolutely it's a competitive move; I'm not disputing that in the least. If NVIDIA can sell more units at a $650 price point than AMD can at an $850 price point, then by all means I'm for it.
 
Nvidia is so far ahead in the marketing game. Their top GPU is priced at over $1,000 and it still sells, while AMD's top GPU goes for under $300 and doesn't sell.

The $650 price point for the GTX 980 Ti is another aggressive move intended to sell all the cards they can produce while guaranteeing that AMD will make a minimal profit off its upcoming Fiji.

Fiji with HBM and water cooling is a very expensive card to make. Current reports (from Hilbert at Guru3D) indicate that it underperforms the GTX 980 Ti. But here's what will happen: Fiji will end up performing about the same as or slightly better than the 980 Ti, at the expense of heat and noise, and it will cost about the same or less, and that will be a disaster for AMD.

Nvidia's price point puts AMD up against the wall. Their card needs to be faster and cheaper, and it simply is neither of those. So they will stretch the GPU's tolerance levels and sacrifice their profit margin to sell some cards. From Nvidia's perspective, $650 is an ideal price point.
 

sinty

Honorable
Aug 8, 2012
Yes, this trend of super-expensive GPUs needs to end. Nvidia charges an arbitrary price that isn't reflective of good price-to-performance.
I'd rather run SLI or Crossfire than a single $650+ 980 Ti.

If those benchmarks for the Ti are legit, then two R9 290Xs running in Crossfire are a much better option and are also cheaper. I would much rather save the cash with refurbished 290Xs or a 295X2 than a stupidly priced 980 Ti after that nonsense they pulled with the GTX 970 back in April. After all that BS, they still expect all of you to pay a high price for a single GPU that is outperformed by two GPUs from 2013 running in Crossfire.

That = bad deal. Don't buy it. If the gaming community continues to buy these expensive GPUs that offer nothing new and go obsolete in six months, then in the next two years you should expect $2,000 GPUs to appear.
 
The 980 Ti is a good deal.

Two 290Xs or a 295X2 requires what? A 1000 W PSU, as recommended by AMD, not to mention AMD's only way of getting more performance is through increased heat and power. The 390X is expected to have a TDP of 375 W, and it may not even be as fast as the 980 Ti.

NVIDIA has money; they can do the research needed to get more performance without that.

Also, @sinty, SLI and CF are sometimes not supported. A single 980 Ti would STILL be better for games that don't support them; that is a given. And obviously two GPUs are better than one; that is also a given. Not to mention AMD's best GPU, the 290X, gives off a lot of heat in CF. AMD has its pros, like cheaper products, but AMD did say they do not want to be the cheaper solution. So expect higher prices.
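(To put the PSU point in rough numbers, here's a back-of-the-envelope power budget; the TDPs are approximate published figures, and the 30% headroom factor is just a common rule of thumb, not an AMD or NVIDIA recommendation.)

```python
# Back-of-the-envelope power budget; TDPs are approximate published figures,
# and the 30% headroom is a rule of thumb, not a vendor spec.
parts_290x_cf = {"R9 290X #1": 290, "R9 290X #2": 290,
                 "CPU (i7-class)": 90, "rest of system": 75}
parts_980ti = {"GTX 980 Ti": 250, "CPU (i7-class)": 90, "rest of system": 75}

def suggested_psu_watts(parts, headroom=1.3):
    """Sum component TDPs and add headroom for load spikes and PSU efficiency."""
    return sum(parts.values()) * headroom

print(f"290X CrossFire: ~{suggested_psu_watts(parts_290x_cf):.0f} W")
print(f"Single 980 Ti:  ~{suggested_psu_watts(parts_980ti):.0f} W")
```

That lands a 290X CrossFire build in the 900-1000 W class and a single 980 Ti build comfortably in the 500-600 W class, which is roughly where the two sides of this argument sit.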
 

g-unit1111

Titan
Moderator


$2,000 GPUs are already here. Have you not seen the $3,000 Titan Z? The R9-295x2 had a $1500 price tag when it was first released.

 

sinty

Honorable
Aug 8, 2012
Making the sacrifice to purchase cooling and at least an 850 W PSU (1000 W is overkill; you don't need that in a Crossfire 290X rig) is the only option you have for 4K gaming. NVIDIA doesn't enjoy supporting this kind of setup and will make you pay through the nose to get it, and in six months your GPU is obsolete.

Two years later, the 290X still outperforms Nvidia's GTX 970 for 4K needs. Two of these in Crossfire are cheaper than the 980 Ti, it seems, and will outperform it by a large factor. So it's either pay roughly $550 for two Crossfire 290Xs or a single 295X2, plus $94 for a Corsair 850 W PSU and a decent cooling system of your choice, totaling an estimated $750 or so for all the components needed,

or $650+ for a single 980 Ti that doesn't perform anywhere near the Crossfire 290Xs, still cannot handle 4K gaming properly, and requires SLI to match or exceed the 290Xs at a whopping $1,300 just for the SLI cards. Tack on the price of the PSU for another $100 or so and you are looking at upwards of $1,400-1,500 just to beat Crossfire 290Xs, haha.

How on earth does anyone justify this? By the way, the 980 Ti will be obsolete by the end of the year. The Crossfire 290Xs won't be.
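(For reference, a quick tally of the build costs quoted above; all of these are the ballpark prices cited in this post, not current or verified quotes.)

```python
# Tally of the build costs quoted above, using the thread's ballpark prices.
builds = {
    "2x R9 290X (CrossFire)": {"GPUs": 550, "850 W PSU": 94, "extra cooling": 100},
    "1x GTX 980 Ti": {"GPU": 650},
    "2x GTX 980 Ti (SLI)": {"GPUs": 1300, "bigger PSU": 100},
}

for name, parts in builds.items():
    breakdown = ", ".join(f"{part} ${cost}" for part, cost in parts.items())
    print(f"{name}: ${sum(parts.values())} ({breakdown})")
```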
 

thehotsung8701A

Reputable
May 18, 2015


I haven't really thought about or done much research on the 290X. So you're saying I can get a massive improvement for less money than a GTX 980 Ti? Then why on earth are so many people rushing to buy the GTX 980 Ti? There can't be that many Nvidia fanboys out there? I expect PC gamers to be more "aware" than console gamers.

I know that the 295X2 is an awesome deal, but I didn't want to deal with the hassle of a dual-GPU card; the 290X in CF seems like a no-brainer then. I really need to do more research on AMD GPUs. Also, how is CF? I have never done CF or SLI.
 
@sinty - 290X CF (or better yet a 295X2) is not faster than a Titan X, and the 980 Ti is the same speed as the Titan X, or very close to it. At least when both are OC'ed.

http://forums.anandtech.com/showthread.php?t=2428870
http://www.gamegpu.ru/images/stories/Test_GPU/Videocards/GEFORCE_GTX_TITAN_X/test/games.jpg

http://www.gamegpu.ru/images/stories/Test_GPU/Videocards/GEFORCE_GTX_TITAN_X/test/wd.jpg

http://www.gamegpu.ru/images/stories/Test_GPU/Videocards/GEFORCE_GTX_TITAN_X/test/mordor.jpg


There are some benchmarks where the 295X2 OC beat out the Titan X OC, but on average it was not as fast. And a single 980 Ti (Titan X speed) gives a far better experience than a dual-GPU setup, because CF/SLI adds latency, uneven frame times, and general game problems from time to time.

Before the 980 Ti, I would have chosen the 295X2 due to the cost of a Titan X, but a single 980 Ti is the same cost as a 295X2 or 290X CF and delivers better overall performance.
 

sinty

Honorable
Aug 8, 2012
Absolutely. In benchmarks, the now over-two-year-old 290X outperforms the GTX 970 at 4K by a bit, and in Crossfire it exceeds the GTX 980 and the new Ti version, per the most recent leaked benchmarks. People are rushing to get it because they want a GTX card. Yes, there are an insane number of Nvidia fans out there who will buy this card at that insane price tag for no other reason than that it's an Nvidia product. Nothing wrong with that; both AMD and Nvidia are fine and make good products for the most part.

Right now, the games are way ahead of the tech needed to run them. The Witcher 3, for example, needs an R9 290X-level single GPU just to run at 2560x1440 with most settings on ultra (minus HairWorks and such), never mind 4K. I personally have a Samsung HU8550 and I run at the step just beyond 2560, which is in the 3000-pixel-wide range. Nothing on the market can handle this without SLI or Crossfire. I get single-digit FPS with a single R9 290X, but with Crossfire I get well into the 40s.

Crossfire and SLI are great; you just have to make sure your PSU is solid enough to run them, your case is nicely vented, and you have some cooling. Most of us have this already, so there is no chance in hell that I am buying a 980 Ti when Crossfire 290Xs and the 295X2 best it... and will continue to best the next flagship GPUs Nvidia will spit out over the next year, which will cost $700-1,000 for a single GPU.

Nvidia is insane, but the community supporting these insanely expensive single GPUs is even more insane. You guys keep buying them, they keep making them. Hardly anyone is standing up to this price gouging. AMD cards remain strong for two years; Nvidia cards drop off and become obsolete twice a year. Two R9 290Xs are going to best the single 980 Ti by a fair margin and will remain a powerful pairing well into 2016. In a few months, Nvidia will drop a new GPU that is even more expensive than the 980 Ti, and it still won't perform on the level of the cheaper SLI/Crossfire pairings.

You can do well with dual GTX 970s as well. They aren't quite as good as the R9 290Xs, but you really can't go wrong with either. Sadly for Witcher 3 players, flickering occurs with Crossfire 290X rigs. Hoping to GOD they fix this soon.

A single 290X runs The Witcher 3 at a steady 43-ish FPS on ultra without HairWorks at 2560-wide resolution. Crossfire 290Xs run it buttery smooth at 60+ FPS. The benchmark for the 980 Ti at the same settings is 48 FPS average. Absolutely horrible performance if that chart is real.
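(Taking those numbers at face value, here's a quick dollars-per-frame comparison; both the prices and the framerates are the unverified figures quoted in this thread, nothing independently measured.)

```python
# Dollars per average frame, using the unverified prices and Witcher 3 FPS
# figures quoted in this thread (2560-wide, ultra, no HairWorks).
cards = {
    "R9 290X (single)": {"price": 275, "avg_fps": 43},
    "R9 290X CrossFire": {"price": 550, "avg_fps": 60},
    "GTX 980 Ti": {"price": 650, "avg_fps": 48},
}

for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['avg_fps']:.2f} per average FPS")
```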

Also, bystander, there is a plethora of YouTube live demos running these GPUs in The Witcher 3 and similar games where the Crossfire 290X setup bested the single Titan GPU. With respect, the charts you've shown prove the much cheaper Crossfire 290X rig is the much better choice. For the price of just one Titan X, you can get quad 290Xs, or two 295X2s (four GPUs... FOUR), and you can buy a beast of a PSU and a cooler and wreck the Titan... as well as the next flagship GPUs for the next 2-3 years.

So, you are looking at FPS performance; I am looking at price to performance. I see your side, and I didn't do a good job of stating and explaining what I meant. I tried to get the point across that the 290X rig is superior by a bit and far cheaper, and in-game YouTube FPS shows the 290X rigs to be more stable. Just my two cents, though. It's an unfair chart because Watch Dogs and Shadow of Mordor were plagued with some of the worst PC performance for GPUs ever.

GTA, COD, The Witcher, etc. are the games people should be benching with, not Watch Dogs and SoM, two of the worst culprits for games so poorly optimized for PC gaming and powerful GPUs that it's sadistically funny to even think SLI 980s are required for 60+ FPS. And you still get bad, choppy play with huge dips in FPS.
 

thehotsung8701A

Reputable
May 18, 2015
@bystander – thanks for the results. So the GTX 980 Ti is still the best?

@sinty - bystander's results show otherwise?

I'm upgrading so I can play The Witcher 3, so that issue with the 290X is really bad news for me. Also, I need more than 4 GB of VRAM because I'm going to be running at 5760x1080, and 4 GB of VRAM won't cut it, unfortunately.
 

MaxxOmega

Distinguished
May 24, 2011

So then don't buy either. You won't get ripped off and you will have your money...
 

andyisvenom

Honorable
Mar 17, 2013


Wow, are those benchmarks legit? I'm just baffled at the fact that Nvidia's newest flagship GPU, which costs more than half a grand, cannot run 1440p at 60+ FPS. Either games are extremely badly optimized, or Nvidia is gimping their GPUs on purpose or something.

I just can't believe a $650 flagship card like the 980 Ti cannot run 1440p, max settings, at 60 FPS minimum. I'm truly disappointed.
 


What is baffling? These are the most graphically cutting-edge games right now. They are using near maxed-out settings. Just because devs leave in some high-end settings does not mean the games are "badly optimized". It just means they allow us to use high-end settings that push your GPU. In the case of The Witcher 3, they publicized that their target was 30 FPS on PCs and consoles.

This is PC gaming. We get options from low to cutting edge. Learn to adjust settings.
 

andyisvenom

Honorable
Mar 17, 2013


The most graphically cutting-edge games? Crysis would like to have a word with you.

Sorry, but there is simply no reason why a $650 graphics card cannot run 1440p at at least 60 FPS. Actually, let's not even talk about 1440p. You are seriously going to tell me that it's perfectly normal for the top-of-the-pyramid, elite-level graphics card to drop as low as 41 FPS at 1080p, when these are supposed to be 4K-level graphics cards?

What's the point of getting a 144 Hz monitor, then, when the supposed new best GPU in the world can't even push 60 FPS on max settings in a game, judging by those benchmarks?

"we get options from low to cutting edge. learn to adjust settings"

If I buy the second-best/most-expensive GPU, I expect to be able to play games with all the special effects maxed out.

I hope you are trolling.
 

andyisvenom

Honorable
Mar 17, 2013


I know that, and that's why you buy the most powerful GPU: to be able to run it. But not even the most powerful GPU can do it, which IMO is very disappointing; there has to be something else going on here.

So pretty much nowadays, if you want ultra settings and a minimum of 60 FPS at ONLY 1080p, you have to buy the top-of-the-line card? Is that what it's come down to? But wait, not even the 980 Ti can get a minimum of 60 FPS on ultra at ONLY 1080p!? A card that is meant for 4K? That is insane.

No excuse. Unless someone can provide me with some legit facts as to why this is happening, I believe either developers don't care about optimizing anymore, or Nvidia is gimping their GPUs.
 


You seriously don't understand how these settings are created. Ultra in one game does not mean the same thing in every game. These are just labels the devs give a particular set of settings. Within the game engine, there are millions of possible settings. They adjust all the different options within the game from a near-limitless number of possibilities, and presto, we have Ultra, or High, or whatever else strikes their fancy. I recall a number of games, such as Crysis 2, which had all kinds of hidden settings that could be adjusted with mods. Would you have considered Crysis 2 horribly optimized had they let us at those settings without a mod? These labels are completely arbitrary. The devs give us the options they think we might like.

Crysis was the most graphically demanding game of its time, but it's been years since it was released. It is no longer top dog. It hasn't been for some time now.

Just because the devs put in settings that are demanding does not mean the game isn't optimized. It just means they let us gamers choose high-end settings not normally available.

There are probably hundreds of games which will drop you to a 41 FPS minimum at one moment or another. That is completely normal.
 

andyisvenom

Honorable
Mar 17, 2013


Sorry, but if you think that dropping to a 41 FPS minimum at 1080p on a $650 card made for 1440p and 4K is normal, I've got some bad news.

4K has been out for two years now, and here we have a brand-new, top-of-the-line, elite graphics card costing an arm and a leg, and you are happy to reach 60 FPS while dropping to 41 FPS at times at 1080p.

We should boycott both AMD and Nvidia for producing these disappointing products.
 


You are completely missing the point here.

Would you rather the devs of some of these games hide the highest-end settings and label some lower settings as "Ultra", never giving you the option to use them, or would you rather have the option to use them, letting you pick and choose a mix of settings?

Some devs might hide the high-end settings, but that does not give us any better performance for the image quality; it just removes the ability to see the eye candy. Some people are going to turn up the eye candy and play without AA as a solution. Others might turn down shadows and use some AA. They just gave you the option to do so, rather than taking all the high-end settings away so you feel better about your PC.

And the minimum FPS in games is most often a result of the CPU and its inability to handle the number of draw calls needed in an area. The minimum is often not about the GPU.

As to my preference? I'd prefer to have all the highest-end settings available; I'd choose which ones are more important to me and turn down others, so I reach 75+ FPS 95% of the time. I'd like those settings because I might play the game again in a couple of years after an upgrade. I might just mess with them to see how impressive it looks with everything turned up, or I might even buy a second 980 Ti and use them now with high FPS.

Someone else might not have the high-FPS desires I do and would play at 30 FPS with everything turned to max. These options are there because the devs did not hide the settings.
 

andyisvenom

Honorable
Mar 17, 2013


No, I understand your point, actually. I am completely okay with the fact that they do not hide the settings; I actually praise them for that. What I am disappointed by is the fact that a top-of-the-line card struggles to maintain a constant 60 FPS minimum at 1080p with those settings, while the card was supposedly designed for even more demanding resolutions with more than double the pixels, like 1440p and 4K. Don't you think it has something to do with optimization? There has to be something more to it. Either the devs are not optimizing right, or the card makers themselves are slacking off. Maybe both, lol.

I will admit I wasn't aware of the minimum FPS having to do with your CPU, though. However, I'm sure a 4690K would not bottleneck a 980 Ti; I would hope not, because if you have to buy a $350 i7 CPU, which is supposedly overkill for gaming as they say, on top of paying almost $700 for your GPU, then I'm seriously considering going console, haha. JK, but it would be such a disappointment.

I found this video after a couple of YouTube searches: https://www.youtube.com/watch?v=yGhAWHGMhPM

His minimum FPS at 1080p with everything maxed out, I think, was 61 FPS, but according to what you are saying, this is thanks to his 5960X? A CPU that costs almost double what the card costs? So to get a minimum of 60 FPS in The Witcher 3 on ultra settings at 1080p, I need to spend almost $2,000 on the graphics card and CPU alone? I'm not saying you are wrong, though I hope you are, because if you aren't, then oh boy, we are doomed for sure.

This further proves that something else is going on here.
 
