News: Micron Confirms RTX 3090, Will Have Over 1TBps GDDR6X Bandwidth

danlw

Distinguished
Feb 22, 2009
137
22
18,695
Flagship graphics cards are like luxury yachts. They're neither meant nor expected to be bought by everybody. They're mainly bragging rights for the manufacturers, and purchases for people who are well off, those who want to be on the bleeding edge, and those who are very poor (because they make large purchases on credit). For the rest of us, there will be the 3080, 3070, 3060, and probably several other variants for people who can't afford the crème de la crème.
 

thGe17

Reputable
Sep 2, 2019
70
23
4,535
The Micron PDF obviously confirms nothing!
Take a closer look at the table; it only contains sample data for advertisement purposes. Column three combines specs for the Titan RTX and RX 5700 XT, and the specs match neither card. Column six has the same problem: six HBM2E stacks but a capacity of 16 - 32 GB. That's simply not possible, so the table data can't be taken seriously and shouldn't be read as a preview of Ampere. More likely, the table contains example configurations and isn't intended to represent exact card configurations; it's only meant to demonstrate Micron's GDDR development.
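(As a quick sanity check on that capacity point, here's the arithmetic; the per-stack densities are my assumption about typical shipping HBM2E parts, not anything stated in the PDF.)

```python
# Assumption: commercially available HBM2E stacks hold roughly 8-16 GB each.
stacks = 6
for per_stack_gb in (8, 16):
    total_gb = stacks * per_stack_gb
    print(total_gb, "GB")  # 48 GB and 96 GB -- nowhere near the table's 16 - 32 GB
```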
 
Last edited:
The Micron PDF obviously confirms nothing!
Take a closer look at the table; it only contains sample data for advertisement purposes. Column three combines specs for the Titan RTX and RX 5700 XT, and the specs match neither card. Column six has the same problem: six HBM2E stacks but a capacity of 16 - 32 GB. That's simply not possible, so the table data can't be taken seriously and shouldn't be read as a preview of Ampere. More likely, the table contains example configurations and isn't intended to represent exact card configurations; it's only meant to demonstrate Micron's GDDR development.
Sorry, but I disagree. The table says Titan RTX and RX 5700 XT. Under that, it lists the number of placements (32-bit channels) as 12. That's correct for the Titan RTX, and obviously not for the RX 5700 XT. It's far more likely that the RX 5700 XT is mistakenly included than that the RTX 3090 data is wrong.

Ignore the RX 5700 XT and you get the Titan RTX: twelve 16Gb chips, a 384-bit memory interface, and 672GBps of bandwidth. That makes for a nice comparison with the RTX 3090, which has the same 12 chips (8Gb this time), the same 384-bit interface, and now 912-1008GBps of bandwidth depending on the clock speed.
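Those bandwidth figures fall straight out of the per-pin data rate and bus width; a minimal sketch, assuming the 14 Gbps (GDDR6) and 19-21 Gbps (GDDR6X) rates implied by those numbers:

```python
def peak_bandwidth_gbps(data_rate_gbps_per_pin, bus_width_bits):
    """Peak bandwidth in GB/s = per-pin data rate (Gb/s) * bus width (bits) / 8 bits per byte."""
    return data_rate_gbps_per_pin * bus_width_bits / 8

print(peak_bandwidth_gbps(14, 384))  # Titan RTX, 14 Gbps GDDR6:  672.0 GB/s
print(peak_bandwidth_gbps(19, 384))  # RTX 3090, 19 Gbps GDDR6X:  912.0 GB/s
print(peak_bandwidth_gbps(21, 384))  # RTX 3090, 21 Gbps GDDR6X: 1008.0 GB/s
```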
 

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
The Micron PDF obviously confirms nothing!
Take a closer look at the table; it only contains sample data for advertisement purposes. Column three combines specs for the Titan RTX and RX 5700 XT, and the specs match neither card. Column six has the same problem: six HBM2E stacks but a capacity of 16 - 32 GB. That's simply not possible, so the table data can't be taken seriously and shouldn't be read as a preview of Ampere. More likely, the table contains example configurations and isn't intended to represent exact card configurations; it's only meant to demonstrate Micron's GDDR development.
I agree with you. This chart isn't intended to show the specific memory configuration of any card. All Micron did was list each memory generation, pick the fastest Nvidia and AMD card for each of those generations, and then show potential specs for each memory type. The only things this chart confirms are that GDDR6X exists and that AMD won't be using it. Also, the top-of-the-line card will not be a Titan and will be named something else. I think 2190 is more likely than 3090, but that's really inconsequential information.
 

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
yup.

top-tier gaming is getting to be an upper-class thing instead of a middle-class one.

hoping AMD (or even Intel) comes in the next few yrs to give competition so prices drop ;/
Midrange cards of today are far faster than top of the line cards from years ago. Gaming doesn't require the top of the line card. Should companies only produce goods that you can afford?
 

vinay2070

Distinguished
Nov 27, 2011
255
58
18,870
Midrange cards of today are far faster than top of the line cards from years ago. Gaming doesn't require the top of the line card. Should companies only produce goods that you can afford?
In years past, households had 1080p TVs; today most buy 4K TVs. Likewise, if you want to game at 1440p with a high refresh rate, or at 4K with all the bells and whistles, you need a top-end card. Even the 2080 Ti struggles at 4K in certain games. So if a 3080 costs $799+, it's a hard blow to the consumer's wallet.
 

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
In years past, households had 1080p TVs; today most buy 4K TVs.

The July 2020 Steam hardware survey shows 2.23% of users gaming at 4K. That's a quarter of the number of gamers still using 1366x768 as their primary display, while 65.5% are still gaming at 1080p. What TV people have in their houses is irrelevant.

Likewise, if you want to game at 1440p with a high refresh rate, or at 4K with all the bells and whistles, you need a top-end card. Even the 2080 Ti struggles at 4K in certain games. So if a 3080 costs $799+, it's a hard blow to the consumer's wallet.

If you can afford a high-end display, then you can afford a high-end video card to drive it. Don't buy a Porsche and then complain about the cost of the tires. There has never been a time when the top video card could crush every game with maxed-out settings. The Crysis meme is 13 years old now. Back in the day, 60fps was the holy grail. No game has ever been sold that requires 4K at 144fps with every setting maxed to be fun to play.
 

neojack

Honorable
Apr 4, 2019
611
177
11,140
Pair this with the next AMD CPU, a Ryzen 4990X or something.

Then MAYBE No Man's Sky VR could run in high quality without vomit-inducing stutter.
Maybe.

Then I would need the same config for my wife too.


Well, maybe second-hand in 2-3 years.
 

vinay2070

Distinguished
Nov 27, 2011
255
58
18,870
The July 2020 Steam hardware survey shows 2.23% of users gaming at 4K. That's a quarter of the number of gamers still using 1366x768 as their primary display, while 65.5% are still gaming at 1080p. What TV people have in their houses is irrelevant.
The TV was just an example. My emphasis was on high-refresh 2K/3K displays.

If you can afford a high-end display, then you can afford a high-end video card to drive it. Don't buy a Porsche and then complain about the cost of the tires. There has never been a time when the top video card could crush every game with maxed-out settings. The Crysis meme is 13 years old now. Back in the day, 60fps was the holy grail. No game has ever been sold that requires 4K at 144fps with every setting maxed to be fun to play.
You don't even need Crysis in the picture; there are quite a few games that will max out a 2080 Ti at 4K 60Hz. With next-gen consoles being more powerful, it will only get harder to reach 4K/3K/2K at 144Hz in games ported from consoles, and bad ports will make it worse. Also, QHD and UWQHD monitors are no longer expensive. I can buy a decent-quality 34" 144Hz Xiaomi monitor for 600 AUD on a frequent deal. And how much do I have to pay for the 2080 Ti to max it out? Almost 2000 AUD. So no, your comparison of a Porsche and its tires is not right; there is a lot of competition in that segment. I still feel $1200 for a 2080 Ti flagship, which was at best experimental with RTX, was a huge price jump compared to the $700 1080 Ti and isn't justified. All it shows is the lack of competition from AMD and nothing more.
 

thGe17

Reputable
Sep 2, 2019
70
23
4,535
Sorry, but I disagree. ...

And still, a Titan RTX with 12 GiB of VRAM does not exist; in fact, no TU102-based card with 12 GiB of VRAM exists. As for the RX 5700 XT, the card has a 256-bit interface and supports only 8 GiB (or 16 GiB). And as I already mentioned, the last column is wrong as well.
And why should the RX 5700 XT be an error? The first column also combines the GTX 1070 and the Radeon RX 580. That's simply an example of the use of Micron's memory.
If you want to see a leak regardless of the inaccurate data, be my guest. ;-)
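For anyone following the placement math being argued here, a small sketch of how bus width, chip density, and capacity relate; the densities used are the ones cited in this thread (16Gb chips for the Titan RTX, 8Gb chips for the RX 5700 XT), not figures taken from the PDF:

```python
def gddr_config(bus_width_bits, chip_density_gbit):
    """Each GDDR chip is a 32-bit placement, so placements = bus width / 32;
    capacity in GB = placements * chip density (Gbit) / 8."""
    placements = bus_width_bits // 32
    capacity_gb = placements * chip_density_gbit / 8
    return placements, capacity_gb

print(gddr_config(384, 16))  # (12, 24.0) -> Titan RTX: 12 placements, 24 GB
print(gddr_config(256, 8))   # (8, 8.0)   -> RX 5700 XT: 8 placements, 8 GB
```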

Afaik, JEDEC has never published the spec for GDDR6X (which it did for GDDR5X), which tells me it's more of a marketing name this time around. It's simply faster GDDR6 on a better node.

That's correct, but that's no problem either. Manufacturers are sometimes ahead of spec organisations like JEDEC. For example, it's the same with HBM2(E): Samsung and SK Hynix are already providing much faster stacks than the 3.2 Gbps defined in the current JESD235C (these are not yet in, or are only now entering, mass production).
And again, HBM2E does not exist officially; the spec still only refers to HBM2, and there will probably never be an official "HBM2E". The name was introduced by Samsung to accentuate their latest development (and SK Hynix took it up), so it's only marketing. ;-)
 
I suspect the cost of this to be at or around the $2,500 USD mark, seeing as that's the price point of their last flagship, the Titan RTX. I suppose we'll see, but no matter what the cost, some people will be tripping all over themselves to buy it. Not because they actually need it... but more than anything, just to say they have it.

I've got a solid 1080 and will rebuild soon, once the new AMD 4000 series drops. I'll think about a 3080, but I'm not going to make the jump until I see the usual gamut of tests from across the board. I'm interested to see whether the rumored "Big Navi" cards from AMD can go close to straight up against these Ampere-based offerings at a lower cost of entry. We'll see soon enough.
 
  • Like
Reactions: JarredWaltonGPU

geogan

Distinguished
Jan 13, 2010
57
2
18,535
top-tier gaming is getting to be an upper-class thing instead of a middle-class one.

Yep. Problem is, they are all being priced by and for Americans on six-figure salaries, or people like the Nvidia boss with millions in the bank.

For the rest of the world on low five-figure salaries, they are just stupid prices.

Anyway, does anyone know if this kind of memory bandwidth is now limited by PCIe 3 and could benefit from the increased bandwidth of PCIe 4, as used on the latest AMD X570?
 
Yep. Problem is, they are all being priced by and for Americans on six-figure salaries, or people like the Nvidia boss with millions in the bank.

For the rest of the world on low five-figure salaries, they are just stupid prices.

Anyway, does anyone know if this kind of memory bandwidth is now limited by PCIe 3 and could benefit from the increased bandwidth of PCIe 4, as used on the latest AMD X570?
Generally speaking, gaming consoles have the equivalent of a mid-range PC graphics card. For this coming generation, that means RTX 3060 / RX 6600 level most likely. Xbox Series X seems like it will land a bit higher up the tree (52 CUs should be high-end), but the top PC parts have always been a big step up in pricing for modest increases in performance.

RTX 3090, like the Titan cards, is not really meant to be a popular card. It's a tour de force from Nvidia, to maintain its lead as the fastest overall GPU. Because there's a lot of marketing value in being able to say you make the fastest graphics card. People might not be able to afford it, but they'll be more likely to buy the second or third tier GPU from the leader.

As for memory bandwidth, it has the opposite effect with regard to PCIe bandwidth. The more memory and memory bandwidth a card has, the less the PCIe bus matters. Right now, a card could theoretically load all the texture and world data into its VRAM in less than a second and then not have to worry about PCIe bandwidth -- only 'new' data needs to go over the PCIe bus. Even PCIe Gen2 can mostly keep up with the draw calls and other stuff that keeps getting sent across it. But being PCIe Gen4 definitely won't hurt performance on the new Ampere and Navi 2x cards if you have an appropriate platform.
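To put rough numbers on that 'less than a second' point, here's a quick sketch using theoretical x16 link rates; the 11 GB card size is just an illustrative example (roughly a 2080 Ti), not a figure from the article:

```python
# Approximate peak throughput of an x16 link after encoding overhead, in GB/s.
PCIE_X16_GBPS = {"Gen2": 8.0, "Gen3": 15.75, "Gen4": 31.5}

def seconds_to_fill_vram(vram_gb, gen):
    """Time to stream a full VRAM's worth of data at the link's theoretical peak."""
    return vram_gb / PCIE_X16_GBPS[gen]

print(round(seconds_to_fill_vram(11, "Gen3"), 2))  # ~0.7 s
print(round(seconds_to_fill_vram(11, "Gen4"), 2))  # ~0.35 s
```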
 
  • Like
Reactions: geogan