Nvidia Volta Megathread

Status
Not open for further replies.
I don't think it's about nVidia acting "defensive" or not. Their long-term plans might shift a little, but their current main goals won't. That is why I believe they'll focus more on AI and bring modest improvements to their GPUs.

For better or for worse, the perf/watt advantage they have over AMD is still bonkers. The leap they took with Maxwell is just bananas, so they can sit on Pascal for a gen or two, just like they did with the G9x generation (the 8000 series). They've done it before; they can do it now comfortably.

Cheers!
 
And then they got hit really hard when the HD 4800 arrived. NVIDIA is not the only company that can pull off the unexpected. Another example is the low-level API thing: even right now NVIDIA still has not neutralized the advantage AMD can get from low-level APIs. The good thing for NVIDIA is that their architecture is very power efficient, so they can brute-force the performance. Now that Intel is directly involved in disrupting the current status quo between AMD and NVIDIA, things could end up very nasty for NVIDIA, although Intel is in hot water themselves right now.
 


What I'm talking about is very simple. It might be an oversimplification, but I don't think it's that far off the mark either: the difference in GPU uarchs between nVidia and AMD currently is like Bulldozer versus Sandy Bridge and later. AMD is taking a stubborn stance with GCN, and until they revisit that, nVidia will keep the advantage. What you mention with the 4800 was taking the lead in perf/watt, but not the actual performance crown; that remained with nVidia until the HD 7970 with GCNv1. We all mocked nVidia for it at the time, but now it is reversed: AMD is taking the stubborn approach and nVidia is just relaxing.

So, all in all, I don't think nVidia will halt all GPU development (R&D) per se, but we won't see leapfrogging. Or at least, I don't expect it until AMD makes a comeback.

Cheers!
 


Yeah, but look at this wicked fast HBM2!
Samsung brands second-gen HBM2 as 'Aquabolt'
Now in mass production
By Shawn Knight on Jan 11, 2018, 12:40 PM

https://www.techspot.com/news/72712-samsung-brands-second-gen-hbm2-aquabolt.html
Samsung on Thursday announced it has started mass production of its second-generation 8GB High Bandwidth Memory-2 (HBM2). The package, dubbed Aquabolt, delivers data transfer speeds of 2.4 gigabits-per-second (Gbps) per pin which Samsung says translates into a performance boost of nearly 50 percent compared to its first-gen offering.

Samsung’s first 8GB HBM2 package offered a transfer rate of 1.6Gbps at 1.2V (and 2.0Gbps at 1.35V). Aquabolt gets the job done at 1.2V.

The new package will offer a 307 gigabytes-per-second (GBps) data bandwidth, achieving 9.6 times faster data transmission than an eight gigabit (Gb) GDDR5 chip, which provides 32GBps of bandwidth.
As Samsung highlights, using four of the new HBM2 packages in a system will enable 1.2 terabytes-per-second (TBps) of bandwidth.
Samsung notes that achieving Aquabolt’s unprecedented level of performance required the use of new technologies related to Through Silicon Via (TSV) and thermal control.

A single 8GB HBM2 package, for example, consists of eight 8Gb HBM2 dies that are vertically interconnected using over 5,000 TSVs per die. Using that many TSVs can cause collateral clock skew although Samsung was able to minimize the skew and boost chip performance in the process. To keep temperatures in check, Samsung increased the number of thermal bumps between the HBM2 dies and even bolstered the package’s overall physical strength by adding an additional protective layer at the bottom.

Samsung didn’t say when it would be supplying the new memory to partners although if it’s already in mass production, one can hope that partner availability won’t be too far out.
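The quoted figures hang together arithmetically. Here's a quick sanity check of the article's numbers; the 1024-bit interface per stack is an assumption taken from the HBM2 standard, not stated in the article:

```python
# Back-of-the-envelope check of Samsung's Aquabolt bandwidth figures.
# Assumed from the HBM2 standard: each stack exposes a 1024-bit interface.
PINS_PER_STACK = 1024
GBPS_PER_PIN = 2.4           # per-pin data rate quoted for Aquabolt

stack_gb_s = PINS_PER_STACK * GBPS_PER_PIN / 8   # bits -> bytes
print(f"Per stack: {stack_gb_s:.1f} GB/s")       # ~307.2 GB/s, matching the article

four_stacks_tb_s = 4 * stack_gb_s / 1000
print(f"Four stacks: {four_stacks_tb_s:.2f} TB/s")  # ~1.23 TB/s, i.e. the quoted 1.2 TBps

GDDR5_CHIP_GB_S = 32.0       # 8Gb GDDR5 chip bandwidth from the article
print(f"Speed-up over one GDDR5 chip: {stack_gb_s / GDDR5_CHIP_GB_S:.1f}x")  # 9.6x
```

The 9.6x figure the article quotes falls straight out of the per-stack number divided by a single GDDR5 chip's 32 GBps.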
 

https://globenewswire.com/news-release/2018/02/08/1336634/0/en/NVIDIA-Announces-Financial-Results-for-Fourth-Quarter-and-Fiscal-2018.html

Gaming is still the biggest earner for Nvidia, but datacenter revenue more than doubled.
 


An April release would follow last year's pattern. 2000 series, here we come!
 


Could NVIDIA block GeForce GTX 20 series cards from mining?
With cryptocurrency mining eating up GPU supplies around the world, could NVIDIA block miners on the GTX 20 series?
By: Anthony Garreffa | Video Cards News | Posted: 2 days, 14 hours ago
https://www.tweaktown.com/news/60812/nvidia-block-geforce-gtx-20-series-cards-mining/index.html

I think they would be shooting themselves in the foot to do this, based on the sales numbers. But it would be great for people who just wanted to game.
 
Sorta. The thing is, even Nvidia is a little annoyed by miners, because they are not long-term customers. Once the crypto mining bubble bursts, they'll just sell the cards on eBay.

Whereas the normal gamer will upgrade his card every 3-5 years, and typically it's the same brand because of brand loyalty.

At least that's what I've heard.
 


I think everything Nvidia is "saying" is an attempt to retain the gaming clientele with the perception of trying to help gamers while still capitalizing on the profits. There are numerous ways they could have made it possible for gamers to get video cards if that was their intention: holding more inventory at their own site, and offering programs/verification to ensure that cards were going to gamers and not to pallets being dropped off at mining farms. They could have people log in, verify their identity, take a number, and wait for a card to be available at retail price. Their lack of doing anything other than making statements in defense of gamers is what is most telling about their efforts.
 
Bummer man. I'd wait for Ampere LOL.

Speaking of which, I'm going to be buying a GTX 2070 or GTX 2080 when they come out. Kinda excited!

The reason why is that I've been having annoying temperatures with my GTX 1080 AMP! Edition. It runs really hot, hot enough that I can't overclock at all and the card drops down to its rated boost clock, not even what GPU Boost 3.0 would give it. So annoying. I'm gonna sell the 1080 soon and grab a GTX 2070 or 2080 when they come out.
 
I know it's probably super close to the next release, but I have to say the EVGA 1070 Ti SC exceeded my expectations. I'm able to overclock the RAM by 700MHz and the core by 200MHz, and it runs at 63 degrees. The cooler has 3 copper heat pipes running the length of the heatsink, and there's a single 8-pin connector with a 180W TDP. This card chews through anything at 1080p on ultra settings! EVGA offers 5-year and 10-year extended warranties with this card for an additional $30 and $60 respectively. It is without a doubt the best-made card I've ever owned, so no buyer's remorse here if they come out with an even more amazing card. But I always have the option to sell this card on eBay for nearly twice what I paid for it if I can get a deal on one of the newer cards.
 
Just heard something interesting: nvidia will supposedly release up to three separate architectures at once to address various markets.

Volta: optimized for HPC and Deep Learning application
Ampere: Gaming optimized
Turing: made specifically for crypto and blockchain

Now the interesting part is that Ampere will probably have its mining capability gimped by nvidia, be it at the driver or even hardware level.....

Now let the speculation and twist begin.....😛
 
BITNAND prepares NVIDIA P106-090 6GB mining card, costs $389
BITNAND will have their NVIDIA P106-090 Mining Card available late February, costs $389
By: Anthony Garreffa | Video Cards News | Posted: 3 hours, 3 mins ago
https://www.tweaktown.com/news/60895/bitnand-prepares-nvidia-p106-090-6gb-mining-card-costs-389/index.html?utm_source=dlvr.it&utm_medium=twitter&utm_campaign=tweaktown

The new NVIDIA P106-090 Mining Card can be purchased directly from BITNAND for $389, with individual card performance varying from batch to batch. BITNAND notes that the "overclocked performance is not a guarantee and will depend on individual card's performance". This is what BITNAND are promising, though results will vary slightly from card to card:

22MH/s ETH ±5%
500H/s XMR ±5%
70W ±5% Power Consumption

As for the technical specifications of BITNAND's new NVIDIA P106-090 Mining Card:

Graphics Processing: NVIDIA P106-090
Core Clock: Boost 1531MHz / Base 1354MHz
CUDA Cores: 640
Process Technology: 16 nm
Memory Clock: 8008MHz
Memory Size: 6 GB
Memory Type: GDDR5
Memory Bus: 192 bit
Card Bus: PCI-E 1.1 x4
PCB Form: ATX
Power Connectors: 8-pin ATX (12V)
TGP: 75W
Outputs: None (no DVI or HDMI)
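From the quoted figures you can get a rough feel for what the card offers per watt, and for its memory bandwidth. A back-of-the-envelope sketch, ignoring the ±5% tolerances in BITNAND's listing:

```python
# Hash-per-watt efficiency from BITNAND's quoted figures (±5% tolerances ignored).
eth_mhs = 22.0   # Ethash rate, MH/s
power_w = 70.0   # power draw, W
print(f"Efficiency: {eth_mhs / power_w:.2f} MH/s per watt")  # ~0.31

# Memory bandwidth from the spec table: 8008 MHz effective GDDR5 on a 192-bit bus.
bandwidth_gb_s = 8008e6 * (192 / 8) / 1e9   # bus width in bytes times effective rate
print(f"Memory bandwidth: {bandwidth_gb_s:.1f} GB/s")  # ~192.2
```

The low 75W TGP is what makes the card attractive for mining despite the cut-down GP106 core; Ethash is bandwidth-bound, so the full 192-bit bus matters more here than the 640 CUDA cores.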

Edit: https://www.bitnand.com/product-page/nvidia-p106-090-mining-gpu-card
 