Nvidia Unlikely To Unveil 2018 Graphics Cards At GDC, GTC


...as someone who works with 3D CG, I have no interest in purchasing a card that has been run in "torture mode" for months on end. I've read that these rigs are not always in the best operating environments, which puts more stress on the components, particularly the cooling system. The last thing I need in the middle of a large, heavy-duty render job is for the fans to crap out and the card to cook itself to death.

There's a reason dedicated mining cards have a shorter warranty; it's not just concern over a bubble bursting.
 
I buy older PCs (DDR3 varieties), make reasonable upgrades, add a gaming GPU, and give them away to friends. At one point I was getting GTX 980s and 1060 3GB models for about $200 used. No more, so no more free upgrades. I did six last year. All went to good homes!
 


...Nvidia has been moving more and more towards high-end compute and graphics solutions like engineering and scientific workstations, deep learning and AI, supercomputer development, and smart automated systems (like self-driving and smart vehicles). They have more than enough interests to keep the big zlotys flowing in.

Notice that on their site, all Quadro and Tesla cards, along with Titan Vs, are still readily available (Quadros with a five-card-per-purchase limit). My thinking is that production of these is getting priority over their consumer-grade cards. Miners aren't buying them because of the high prices, even though they are actually better suited for the type of compute operations mining involves. Imagine having 640 Tensor cores (Titan V/Quadro GV100) at your disposal.

Actually, with the prices I'm seeing some 1080 Tis listed at, for a couple hundred more one can get a 16 GB P5000. OK, not the best for gaming, but for 3D enthusiasts that's pretty much all the VRAM most will ever need.
 
Well, at least my 1070 won't be going obsolete as soon as I thought. lol. It's probably a smart strategy to hold off while RAM is so highly priced and they would have to charge more. Not that the problem will be fixed by then, but at least it won't add to the crazy demand for a while, and maybe they know something we don't about what the market will be like in July or August.
 
Knowing these guys, they'll release a Pascal refresh or multiple gimped architectures and milk the situation even more.

Funnily enough, if companies actually built PS4-tier systems with the GPU right on the board, they'd save us money; the only problem is finding some proprietary method to stop mining on them.
 
Has anyone heard when memory - to enable higher GPU production rates - will finally be plentiful again?
 

Tensor cores do [4x4]x[4x4] FP16/FP32 matrix multiplications, mainly for neural-network training, and are useless for mining, where hashing algorithms are mostly purely in-sequence INT64 add/mul/shift/rotate/XOR operations with little to no chance of massive parallelism within a hash. That goes double if your aim is memory-bound alt-coins like Ethereum, where memory size, memory bandwidth, and memory latency are the main bottlenecks instead of processing power (by design, to impede ASIC domination): it doesn't matter how fast your GPU or ASIC is if it doesn't have enough RAM or RAM bandwidth to reach its full speed.
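Just to illustrate what that in-sequence integer work looks like, here's a minimal toy sketch (a made-up add/rotate/XOR round, not any coin's real algorithm) where every step depends on the previous result, so 4x4 matrix-multiply hardware has nothing to chew on:

```python
MASK64 = (1 << 64) - 1  # emulate 64-bit wraparound in Python

def rotl64(x, r):
    """Rotate a 64-bit value left by r bits."""
    return ((x << r) | (x >> (64 - r))) & MASK64

def toy_round(a, b, c):
    """One made-up add/rotate/XOR round; each line depends on the one before it."""
    a = (a + b) & MASK64                    # INT64 add
    c ^= rotl64(a, 13)                      # rotate + XOR
    b = (b * 0x9E3779B97F4A7C15) & MASK64   # INT64 multiply by an odd constant
    a ^= c >> 7                             # shift + XOR
    return a, b, c

state = (0x0123456789ABCDEF, 0xFEDCBA9876543210, 0x0F1E2D3C4B5A6978)
for _ in range(24):   # rounds run strictly one after another
    state = toy_round(*state)
print(tuple(hex(x) for x in state))
```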
 

Samsung and others have begun building new chip fabs, but those won't be online for another 2-3 years. Silicon wafer manufacturers are also overbooked, and I have little doubt that other suppliers of wafer manufacturing/handling equipment are on back-order too.

So my pessimistic estimate is as late as three years from now. My optimistic estimate is no sooner than late 2019.
 

...nice.

Yeah, I came into my second one that way with a 750 Ti. It's sufficient for small GPU render jobs and also has 32 GB of DDR3. As I also work in Carrara, I can network both my systems together for rendering and have a total of 20 CPU cores and 56 GB of memory.
 


Sorry to disagree with you, but this article is not BS. Of course THW would not disclose their sources, but I bet it's not just some crap found on other websites and republished here, as is the case with lots of other hardware news sites.

One more reason I think this article is useful information: I was about to delay the build of a new PC until April, hoping to see some new products from Nvidia after GDC/GTC, but it seems those new products will be here in August at best, and I won't postpone my build that long.

Cheers
 


I think most of us in the future will be using systems with integrated graphics, or systems where the discrete GPU is soldered to the motherboard like on laptops and Steam Machines.

If miners continue to drive up the price of GPUs, AMD and Nvidia will eventually increase their prices, which will make pre-built systems and bundles expensive. Right now, AMD and Nvidia aren't the ones setting the prices; the resellers are setting them far above MSRP due to demand.
 

...that may work for games but not for CG rendering. Engines like Octane and Iray specifically make use of high CUDA core (stream processor, in AMD terms) counts and VRAM. When shunted to a CPU and system memory, rendering performance degrades significantly due to slower bus speeds and fewer processing threads.

The Ryzen 5 2400G only has in the neighbourhood of 700 stream processors (slightly more than my Maxwell 750 Ti) and 16 ROPs (same as the 750 Ti). A 1070 (considered the "optimal" card for GPU-based rendering) has 1920 cores and 64 ROPs.
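Purely as a back-of-the-envelope illustration of why that core-count gap matters for GPU renderers (a naive ratio by shader count only, ignoring clocks, architecture, and VRAM; the 640-core figure for the 750 Ti and 704 for the 2400G's Vega 11 are spec numbers I'm assuming here):

```python
# Rough relative throughput by shader-core count alone; not a render benchmark.
cards = {
    "GTX 750 Ti": 640,               # CUDA cores
    "Ryzen 5 2400G (Vega 11)": 704,  # stream processors
    "GTX 1070": 1920,                # CUDA cores
}

baseline = cards["GTX 750 Ti"]
for name, cores in cards.items():
    print(f"{name}: ~{cores / baseline:.1f}x the parallel shader ALUs of a 750 Ti")
```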
 

Source? Samsung has multiple new fabs/expansions in the works as far as I can tell, but these cost billions of dollars and take time to build. The GPU shortage has been going on for less than a year (and the most recent, most severe shortage only a couple of months). They can't fix supply issues overnight.
 

...very true. Many years ago, when a major plant was destroyed by fire, it took around a year after production was restored for the situation to recover throughout the supply pipeline.

However, even when this is overcome, we still have the matter of mining, which is the primary cause of the recent, more serious GPU shortage. Hence, prices may moderate a bit, but I don't see them coming back down to where they were when, say, Nvidia first introduced the 10xx series.
 
Why does the press hype us up first and then pull the rug out from under our feet?

I hate that. Report something confirmed and stop spreading rumors!!!
 


If they don't, then I'm not upgrading. If current prices become the norm, I'm going to console.
 
So let's talk again in July 😉

If I write such things, I know what I'm doing :)

 