As usual Nvidia has nothing good to offer!

Status
Not open for further replies.

COCAYven

Honorable
Jul 9, 2013
I am truly honestly trying to buy an NVIDIA GPU.

I am really trying hard

Every time there is a decent show like IFA or CES, I try to go or at least follow most press conferences and follow up on the new tech or releases.

Since 2010 I simply couldn't bring myself to switch to NVIDIA products; there were always other solutions that did the job far better and were far more convenient from a wallet and amortization point of view over a 5-year life span.

Having said that, I must now admit that after this 2015 CES the 780 Ti and Titan cards have become very valuable to me, for the simple reason that the new 900 series sucks compared to them.

I might switch to a couple of 990s if they come out anytime soon and have double-precision performance better than the old Kepler consumer stuff (which, by the way, works much better than the professional stuff).
If you ask me why... my buddy here at my side has a Tesla K6000 and a Titan, and the Titan outperforms all the pro cards when it comes to raytracing; he's running 32 CPU threads.

*(Hyper-Q matters here. Hyper-Q is apparently enabled in all the new Maxwell GPUs, which is why you get such better performance if you are running multi-threaded CPUs, even if the GPU chip itself is basically a bad-born native Kepler chip.)

Anyhow... this is simply due to the pro cards having a clock rate that drops once the cards get hot. While you can control the fan speed on the Titan card, you cannot do the same on the pro cards, thus making the Titan outperform the pro cards even if not all the Hyper-Q lanes are fully enabled.
Liquid cooling would most likely solve the issue; however, I'm pretty sure you would all agree that you are certainly not willing to give up the warranty on over 8k worth of hardware.
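If you want to watch that throttling happen for yourself, here is a minimal monitoring sketch (assuming the standard nvidia-smi tool is on your PATH; the query field names are the common ones and may vary with driver version):

```python
import subprocess, time

# Poll GPU temperature and current SM clock once per second.
# A clock that sags while the temperature sits at its limit is the
# thermal throttling described above.
while True:
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=temperature.gpu,clocks.current.sm",
        "--format=csv,noheader",
    ]).decode().strip()
    print(out)  # e.g. "83, 1006 MHz"
    time.sleep(1)
```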

For the heavy stuff we use this: http://amfeltec.com/august-2014gpu-oriented-pcie-expansion-cluster-from-amfeltec-is-in-production/ Lots of GPUs, lots of noise, but yeah, CUDA is cute sometimes if you can set it up Taiwan-style with old stuff that costs nothing. We don't even care about the dust or bother to clean it.




In the year of 4K at 60 Hz, where 4K is affordable even at true cinema 4096x2160, Nvidia as usual has NOTHING to offer at CES but pathetic Tegra chips for cars, in a world where Intel doubles its computing power every 13 months.

Nvidia has nothing more to offer than pulling high-end Kepler cards out of the market to promote the 900-series trash! *(Won't comment on their Tesla K80 for WW1 workstations.) Thus cheating their shareholders.

Good lord, nobody is even buying iPhones any more; no one will spend a dime on any Tegra chip!

Even Sony with their Xperia phones does better than Nvidia when it comes to dealing with 4K content.

4K is NOTHING new. We have been working on 4K content for over 2 years now and have colleagues currently working on 8K too!

So why does NVIDIA HAVE NOTHING TO OFFER for 4K 2014/15 clients?!

It makes NO sense at all to need two GPUs to run one single monitor... even less to spend more money on a GPU than on a monitor, or EVEN WORSE, on your whole computer itself!

If LG sets the price for a true cinema 4K 4096x2160 monitor at 1300 dollars, any upcoming replacement for the GTX Titan Z should cost at MOST half the price of the monitor; any pricing above that is just proof that Nvidia has nothing to offer for the technology currently available in the market.

For those of you who didn't understand, I will repeat... if the most expensive monitor costs 1300 dollars and the cheapest costs 500 dollars, then the single GPU running it should cost from 250 dollars for the lowest-end one to 650 dollars for the top-end one, not a DIME MORE!

If you cannot mass-produce GPUs at these prices then you SUCK! Your company sucks, your project managers suck, your software engineers suck, and you as CEO suck too and should be removed for incompetence.

So in a few words: you should be able to have, for example, a GTX 960, which is an acceptable low-end card, for 250 dollars, yes, BUT it should be able to manage 4K ALONE! The top-end card, e.g. a Maxwell Titan, should cost 650 dollars and should offer the maximum computing power possible on a single GPU at 10-bit colour depth, not only in DirectX but on a 10-bit professional monitor. A Titan Z 990, instead, should offer a lower-TDP solution for dual-GPU computing capability, where space, for example in a workstation, is very limited. *(And yes, it can cost 30 to 40% more since it shares the same PCB, but not 3 thousand dollars only to then drop your undies and try to sell it at 1500!)

Even a simple GTX 980, which is a fairly expensive GPU, should be able to manage at least the common fake-4K 3840x2160 monitors running at 60 Hz at maximum settings, offering at least 45 FPS in the most intensive and heavy benchmark test or game.

PAUSE... You run 1080p?! Then you don't need more than a 750 Ti, or even a fashion-victim Mac laptop running Boot Camp, so go read some other post; this is not relevant to you.

8K and DP 1.3 are around the corner; there is no way Pascal will be launched in 2016, just as Maxwell wasn't launched in early 2014 as on Nvidia's roadmap... I'm certainly going to enjoy Intel Skylake Xeons by then, as that is my next CPU upgrade...

It was already bad news for professionals to see the Quadro refresh offer old, bad-born, crippled Kepler GPUs which are not double-precision enabled across most of the refresh line.

All this means to me that by the time Nvidia has something decent to offer for current hardware, we will have so much computing power from Intel that we will NOT need GPUs at all anymore.

You guys had better stick to the old stuff, even used, if you can find it... and keep your money tight in your pocket. There is absolutely NO value at all in buying 980 or 970 cards; if you need to run more than one GPU, then the old stuff is a far better deal with far more computing performance. You never know, you might eventually need that!

If it comforts you, NO one is buying GTX 980s, and NO one is buying the GTX 970 either. This is why they have not released new Titan cards: the old ones are still abundantly available.

People are sticking to their old GPUs and just waiting for single-GPU 4K solutions that won't cost more than the monitor... that's a fact! *Especially considering that most of these monitors are 10-bit displays.
What are you going to do, sell pro cards to gamers? Or cripple double-precision performance on GTX as you just did on your new MONSTER GPUs?!... Ehm, AMD might not conform to your solutions...

Better hold fast, Jen...
...Time to wake up, Jen: with the Nasdaq at a historical high, oil at a low, and interest rates under your feet, the future doesn't seem too bright for you at this moment!

Right!!!... Made in Taiwan.
 
Titans are not really "cheap"; even if one could get an affordable 4K monitor (relatively speaking, as they still cost a lot), they would need a very powerful GPU to run it.
Maybe they don't have anything to offer since you can already buy Titans. For most of us, we are thankful that they gave us the 970, so they do have something to offer, well ahead of these events.
 
The problem isn't that the GPU makers haven't been making better GPUs; they have. They have been doing it because there is demand for faster cards, and you know why that is? Even though the screen standard has been 1080p for like 8-10 years, why have the cards steadily gotten more powerful while the average FPS in new games hasn't gone up? It is because the graphics engines have gotten better; these people have been making 1080p look better and better every year. Now these 4K monitors come out, and rendering that 4x increase in pixels takes roughly 4x the GPU power, but the current game engines already push these GPUs to the max to get 80-90 FPS, and you somehow want Nvidia to pull a 4x more powerful GPU out of their arse in under 2 years. When has any GPU or CPU quadrupled in performance in under 2 years? (Or, for that matter, in the last 1-2 decades?)
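Just to make the "4x the pixels" point concrete, a quick back-of-the-envelope calculation (nothing more than pixel counting):

```python
# Pixel-count comparison: 4K vs 1080p.
full_hd   = 1920 * 1080   # 1080p
uhd       = 3840 * 2160   # "fake" 4K / UHD
cinema_4k = 4096 * 2160   # true cinema 4K (DCI)

print(uhd / full_hd)        # 4.0  -> exactly 4x the pixels of 1080p
print(cinema_4k / full_hd)  # ~4.27x
```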
 
Oh, forgot to mention that the graphics engines aren't optimized for 4K either; they tend to target 1080p, which is the standard in use. Also, how long did it take 1080p to become the standard? I remember it being around in the early 2000s but didn't really see it commonly until the mid 2000s; hell, I still don't know a single person who has a 4K TV or monitor. It's not a standard yet; don't think things just magically change the second someone brings something new out. We've been pushing the boundaries of conventional CPUs for quite a while now, and I feel like GPUs are getting there; they keep doing more unusual things to get more performance out of the new cards. Four years ago someone would have laughed at you if you said the new flagship GTX was going to have a 256-bit memory bus, but the new one does now, and somehow it's still much quicker than its predecessor.
 




I'm running a FirePro 9000, so I don't really need any Titan; however, I can currently find a 780 Ti at 400 euros, which is very appealing to stick in our GPU cluster and keep there till it dies.
However, the thread is about 4K... The GTX 970 is definitely a very expensive card for running 4K, even if it cost 100 dollars, because it simply won't handle it if you're gaming... and the performance standards are taken from intensive game benchmarks and intensive processing programs... It's a dead brick;
this is where the old Kepler cards are way better than these new trashy cards.
 


 
1080p 16:9 full HD, the "standard" as you call it, never required any super GPU in the first place... However, people buying expensive GPUs never have and never will spend a dime on 1080p monitors; they would at least have a 24-inch, 1200-pixel-tall monitor, up to 30 inches.
Some conformed to 2560x1440 due to the 16:9 ratio pushed by full HD.
But 1080p is dead; this is its last year... By 2016, whoever buys a new monitor will buy 4K, and once DP 1.3 and HDMI 3.0 are common you will regret buying a GTX 970... What if in September you see an HDMI 3.0 4K monitor at 300 bucks and you just bought your GTX 970? Would you still be as grateful as you are now?... I definitely wouldn't, if it were me.

The GTX 970 and 980 are very welcome for their lower TDP, but to make these GPUs really wow, Nvidia should also have put an HDMI 3.0 port on them and gotten rid of DVI. They didn't, which means they are just trying to suck blood from the clients they have already sold to.
Trust me, if you are a gamer, stay away from those cards; if you are a content creator, too... And stay away from the next ones if they again come with DVI, no double precision and NO support for 10-bit displays... because you will NOT use that card much, unless you bought it for your 12-year-old kid.
 
Hey,

I really don't know where to start but your entire post makes very little sense...

1) For example, stating what a GPU should cost based solely on what an HDTV costs? That indicates a severe misunderstanding of how product pricing works.

Also suggesting a card should perform well in 4K gaming just because we have 4K monitors doesn't make sense.

2) You then talk about how nobody is buying the GTX970 and GTX980 cards when they are in fact amazing cards that are selling extremely well...

3) No need for more than a GTX750Ti for 1080p resolution in games? Uh... no.

We have things called "benchmarks" that you should investigate. For example, try running Far Cry 4, Crysis 3, or any demanding game at max settings at 1080p on a GTX750Ti and you usually won't get 60FPS.

4) "the old stuff is a way far better deal with way more computing performance..."

Uh... what?

*I have absolutely no idea where you are coming up with most of these "facts" because most of what you say is either incorrect or doesn't make sense. There are certain Titan cards that may have poor value but as a consumer you have choices.

As for gaming, a GTX970 is hands down arguably the best card on the market for value in high-end gaming.
 


The "mate" didn't give it away, sad, oh well talking about here where I live in Australia 😀
Also 8-9ms response and isn't rated to run in the conditions my house was at 2 days ago doesn't seem like a good idea. Also considering something like 25% of all males have some level of colour blindness I'm supprised we worry so much about IPS over TN panels 😀

 


a few things:
"cheap" is relative, a 4k worth $1300 is not cheap for most people.
and, gpu price has nothing to do with monitor/tv price.

Also, I don't like Apple, I really don't. But to say that nobody buys iPhones anymore is really just you ranting; how many people have bought iPhones over the last few years?

No one is buying 970s? Ask the people here how many have them; I know several of my friends who bought them. Check the forum here: how many threads are about 970s?


 
The reason they're not working on any viable 4K ready cards is because nobody really games on a 4K TV.
4K monitors are beyond most people's budgets, and last I remember, NVidia makes the least money on their low and enthusiast cards, profit being more on mid and high.

AMD is moving forward with their high speed memory modules, so think about them.
NVidia is currently dominating the market and in the end, all they want is the money.
 


I don't think you should wear an Nvidia tag, as your knowledge of this matter, judging by this post, is very poor. As for benchmarks and power draw, they are the evidence of the GTX 900 series being inferior to the Kepler cards, as their power draw is higher while the overall performance is barely similar.
 


You can easily find 4K monitors costing less than 1440p ones, and by spring most probably around the 300-dollar mark.

As of today, you can find 4K monitors of excellent quality running at 60 Hz for around 380 dollars.

1080p will die this year for the people who actually buy GPUs, and it will be relegated to people who don't buy GPUs and stick with their computer's original specs.
Unless, of course, you want to buy a GTX 980 to use with a 99-dollar 1080p monitor.

This post is NOT about 1080p; it is about 4K monitors and the lack of GPUs to run them with decent performance and 10-bit colour depth.
 


A GTX 980 draws massively more power than a Titan or 780 Ti at load to more or less equal their performance; if you ever compare a GTX 780 Ti Kingpin, which draws as much at load as a GTX 980, you will easily see that it annihilates the newer Maxwell cards.

The 900 series and high-end Kepler will handle 4K more or less the same... However, this is NOT the point... The point is that NVIDIA did NOT announce any GPUs that can manage 4K with a MINIMUM of close to a 60-100% increase in performance over Kepler, NOR did it announce that the cards to come will be 10-bit capable. (10-bit DirectX, which is what the current GTX cards support, is not enough in the 4K age.)

All of this when the 900 series should have been on a different die and released JUNE of last year.
HELLO, wake up: Nvidia is at least 1 year late!
 


That kind of sums up this thread! :pfff:
 
There are some people who do not understand why a GPU should NOT cost more than half the price of a monitor, so I wish to address this clearly, one point at a time.

This post is about 4K monitors, and I will take as a reference what is currently the best professional monitor, which is FLAWLESS, supports 60 Hz gaming, ESPECIALLY 10-bit colour depth, and has TRUE 4K resolution at 4096x2160.

The Monitor taken as reference is:

LG Electronics 31MU97-B, 31"

which currently retails for between 1100 and 1400 US$.

The life cycle of a 1-billion-colour true cinema display panel like this one is a MINIMUM of 5 years before you would ever decide to change monitor for newer tech (e.g. 10-bit OLED at 144 Hz or much higher refresh rates, which are likely by then).
The life span of, e.g., a Titan 2 GPU, which might be available for retail around May of this year, is instead much, much shorter, and it has to be taken into account that a GPU like that would definitely become a BRICK well before those 5 years are up.
Also, some people have mentioned benchmarks, and to me the ONLY true benchmark that matters when it comes to FPS is Unigine.

This is for a few very simple reasons.
The first reason is that benchmarking with games cannot be an accurate benchmark, as I run T4 internet at 300 Mbit and would definitely get a minimum of around 35% more FPS than someone running internet at 10 Mbit. (Some providers here in Europe already offer over 1 Gbit internet speeds.)

The second reason is that since Unigine benchmarking is extremely graphics-intensive, if I have a 4K monitor that can output 60 Hz, I would need at least a HIGH-END GPU that can comfortably give me above 60 FPS *(while a low-end card, e.g. a GTX x60 or below, or a GT card, should give a minimum of 30 FPS), as anything below that would mean that the technology of the GPU being tested is OUTDATED compared to the monitor the benchmark is being run on. This doesn't mean that the monitor is above standard tech or something from another galaxy; the monitor in question is a CURRENT, very affordable, high-end piece of equipment that anyone can buy easily without needing to find some secret dealer in some town on the other side of the planet. In a few words, NOT an early-adopter piece of tech, like OLED or 8K would be, for example.

So it is NOT hard to understand that, for example, when the new version of the Titan Z hits the market, we can comfortably say with confidence that the so-called future MONSTER GPU will already be a brick compared to the technology available right at the moment of its release.

Some of you might say, OK, but what about SLI...

Sure... SLI will solve the problem; however, if I go dual GPU, it will be because I'll run dual monitors; if I run 3 GPUs, it is because I'm running 3 monitors; and if I am running 11 GPUs (which is what we currently do from time to time), it is because we are rendering intensive scenes in a GPU render cluster.
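As an aside, for anyone wondering how render jobs get spread across a box full of GPUs, here is a minimal sketch of the idea only: pin one job per GPU and hand out work round-robin. The render_frame.py worker and the frame range are hypothetical placeholders, not how our cluster is actually driven:

```python
import os
import subprocess

NUM_GPUS = 11            # GPUs in the render box (the count mentioned above)
frames = range(1, 241)   # hypothetical frame range for an animation

# Pin each job to one GPU via CUDA_VISIBLE_DEVICES so jobs don't fight
# over the same device; frames are handed out round-robin across GPUs.
running = []
for i, frame in enumerate(frames):
    env = dict(os.environ, CUDA_VISIBLE_DEVICES=str(i % NUM_GPUS))
    running.append(subprocess.Popen(
        ["python", "render_frame.py", "--frame", str(frame)],  # hypothetical worker
        env=env,
    ))
    if len(running) >= NUM_GPUS:   # keep at most one job per GPU in flight
        running.pop(0).wait()

for p in running:
    p.wait()
```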

So I am not exaggerating and claiming that a graphics card should be able to run three 4K monitors at once, but it is reasonable to ask that, for a certain price tag, the piece of equipment I am being asked to pay for should at LEAST deliver nothing less than excellent performance on one monitor!

Now let's also take into account software like Chaos Group's V-Ray; there is NO doubt that 5 years from now this year's V-Ray will be thoroughly obsolete.
So if we ALSO want to take into account the computational power of any given GPU, we have to consider that while Nvidia offers on average a 35% increase in performance every 2 years, Intel offers a 100% increase in performance every 13 months on average.

To some of you this might be meaningless, but for people who need computation and processing power, it is easy to understand that a software like V-Ray, which now supports Intel Embree, easily improved its render times by 30-35% on average with its first release.
The math is easy: 2 years from now the best Intel CPU will perform 400% faster than the best current CPU...

In 5 years' time... which would be the minimum life span of my 1200 US$ monitor, Nvidia would give me maybe a 100% increase in performance, while Intel would give me a 1600% increase in performance.
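Taking those growth rates at face value (they are my own rough claims above, not measured figures), the compounding works out roughly like this; the percentages above round these numbers down a little:

```python
# Compound growth under the assumptions stated above:
#   Intel:  2x every 13 months; Nvidia: 1.35x every 24 months.
months = 60  # ~5 years, the monitor's assumed life span

intel  = 2.00 ** (months / 13)   # ~24x over 5 years
nvidia = 1.35 ** (months / 24)   # ~2.1x over 5 years (the "maybe 100% increase")

print(f"Intel ~{intel:.1f}x, Nvidia ~{nvidia:.1f}x over {months} months")
```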

Now let's also take into account software like Autodesk Maya or 3ds Max, which improve release by release what they can do, with relatively low price tags compared to engineering software whose licenses can easily cost up to 40 thousand dollars... Over a 5-year life span, people using tools like that find, year by year and release by release, that the GPU is NOT up to the task of what the software they paid for can truly accomplish.


Do we want to mention games?

How many games do you buy each year on average? And how long will a title stay current before a new release with enhancements?
18 months? 24 months?
What will Crysis be like 2 years from now? Do you really expect that 2 years from now a NEW MONSTER GPU, for example a TITAN Z 2, will be up to the task for it?

Well... it has never happened before, it is not happening now, and when we see a new GTX 990 at, e.g., a 1500-dollar price tag, we have every reason to wonder whether that GPU will be up to the task of giving you 60 FPS in a graphics-intensive game at 4K with high-resolution textures...

All of this while, at the same time, DP 1.3 and HDMI 3 will be offering you 144 Hz and you have finally decided to trash your 1080p for a 450-dollar, 4K, 10-bit-colour, 32-inch LG or Dell monitor.
Then you might find yourself saying, "Uh oh... now I have to change MY GPU too... anooooother 1500 dollars?!" "Anoooooother 500 dollars?!"
"Anooooooooooooother 300 dollars?!"

So the reasons for a GPU to cost a MAXIMUM of half the price of the monitor it can drive, if it is even UP TO the task, are not only logical but entirely reasonable to expect from a company whose core business is to offer graphics processing units.

While being constantly outdated on every single new release, NVIDIA happily enjoys ordinary people posting in forums about this monster GPU, that monster GPU...
I am no one to tell people what to do, but ideally, instead of calling some GPU a monster GPU, people might better start saying that some new product is NOT up to the task and is a bad release compared to what the market has to offer out there.

A GPU should definitely cost half of its corresponding monitor AND deliver all the power needed for the current most intensive graphics tasks at the monitor's maximum refresh rate... If not, it is simply trash and nowhere close to being a monster.
 
1. First reason: benchmarking with games cannot be an accurate benchmark, as I run T4 internet at 300 Mbit and would definitely get a minimum of around 35% more FPS than someone running internet at 10 Mbit.
-- FPS has nothing to do with your bandwidth. Even online games rely VERY LITTLE on your bandwidth; what matters is your latency, the ping.

2. A GPU should definitely cost half of its corresponding monitor AND deliver all the power needed for the current most intensive graphics tasks at the monitor's maximum refresh rate... if not, it is simply trash and nowhere close to being a monster.
-- These are 2 different products; how can one be priced at half of the other? They have different costs: R&D, manufacturing cost.

If a game being released is too heavy for current hardware, should the GPU and CPU cost half of that game? Crysis, in its time, was too heavy even for high-end GPUs.
Also, we must remember that there are other factors in FPS, not just resolution. It also depends on the game; would we need a monster GPU to run Super Mario (if we could) in 4K?
 
Solution