AMD Shares More Details On The Radeon VII

"The AMD Radeon VII comes equipped with 60 compute units, and since the graphics card adheres to the GCN standard, this tallies up to 3,840 stream processors, 240 TMUs (texture mapping units), and 64 ROPs (render output units)"


I thought it had 128 ROPs?
 


That's quite a bit of power, certainly... but most home builds won't have an issue with it if they're put together properly.


Now, I realize there was no ray tracing involved, and the titles/apps used were probably cherry-picked... but from what I could gather from the data provided, it can outperform the RTX 2080. It's nice to see that... but as pointed out, 295 W is the cost beyond the initial investment. I'm sure we'll know more once we get to see actual reviews.
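To put that 295 W in running-cost terms, a rough sketch; the daily hours and electricity rate below are purely illustrative assumptions, not figures from the article:

radeon_vii_watts = 295
rtx_2080_watts = 250
hours_per_day = 4            # assumed daily gaming time (illustrative)
price_per_kwh = 0.13         # assumed electricity rate in $/kWh (illustrative)

extra_kwh_per_year = (radeon_vii_watts - rtx_2080_watts) / 1000 * hours_per_day * 365
extra_dollars = extra_kwh_per_year * price_per_kwh
print(f"{extra_kwh_per_year:.0f} kWh/yr, about ${extra_dollars:.2f}/yr")
# -> 66 kWh/yr, about $8.54/yr under these assumptions

So the power delta is real but modest in dollar terms; the bigger practical concerns are heat and PSU headroom.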
 
Hm, if it does 4K better without ray tracing, I'll probably get it over the 2070 or 2080. I don't much care for the performance hit ray tracing causes, and since I do a little editing on the side, this will probably be the better fit for me.
 
Ray tracing doesn't matter yet... but at only a $50 price delta (if you catch a 2080 on sale), I'd honestly go Nvidia. I hope third-party Vega 2s are cheaper, especially if the TDP is 295 W on 7 nm vs. 250 W on a 2080.
 


You're likely spot on; however, we should probably wait for test results. Personally, I'm waiting for Navi and Nvidia's 7 nm refresh before my next GPU purchase, so I can decide between those two products.
 


I never like company-provided results, mainly because if you look closely, each game uses a different API and resolution. It smells like cherry-picking to me.

We'll see once it's benchmarked by third-party sources using common games instead of hand-picked examples that show it in the best light.
 
I'm a fan of AMD, but with a higher TDP and the same price and performance as one of the worst-value cards in Nvidia's history, this card isn't competitive. In five days AMD will no longer have the FreeSync advantage, leaving them to compete on performance, value, and efficiency alone. They lose in at least one of those and gain in no other category. And again, that's against a card Nvidia is price-gouging on, with little performance gain over the last generation. Plus Nvidia brings RTX to the table, however small that added value is for the prospective buyer. Add the fact that, even if all things were equal, most people would rather pick up an Nvidia card, and AMD will have a hard time selling this one.

I feel like they launched this card just so AMD can say they have a high-performance/4K GPU after all, while it wasn't possible for them to price it properly and still make a profit, given how expensive the card is to produce on TSMC's 7 nm. It's one of those "barely worth it, but might as well release it so it exists" kinds of products, pushed mainly to uphold their reputation. It was a bit silly that a company that prides itself on being a GPU company (supplying GPUs to all the high-performance console vendors) had no 4K-capable GPU available. This fixes that, but the card is a very tough sell. I was waiting for a flagship card from AMD, but I won't be getting this one. It also feels like an opportunistic, unplanned launch that happened purely because of Nvidia's unexpectedly poor current line-up.

Paradoxically, it might be a good deal for the compute and productivity crowd. Getting a high-performance GPU for $699 that's not far off from the professional cards could be a steal. Indie devs and start-up studios might go for it, especially if they ALSO game. But as a pure gaming card, it isn't good value.
 




The 250 W TDP on the 2080 is closer to 350 W when OC'd to the max (reference and non-reference designs can pull around 300 W even at stock), and if the Radeon VII is anything like first-gen Vega, it'll overclock while undervolted and land around the same TDP as a reference-design 2080.
 
I think you're right here.
Rumors are:
1. The only reason this card exists is that Navi is delayed, and this was what could be brought to market in time.
2. Early estimates put the production cost for this card at $750. Expect each GPU to be sold at a loss by AMD.
 
I think at this point it comes down to what you value more: double the VRAM or ray tracing. I'm personally going with double the VRAM. Digital Foundry's dive into the Resident Evil 2 demo showed that at 4K max settings the game can use up to 10 GB of VRAM, and that's a close-quarters game; imagine the open-world games coming in the next three to four years. I'll take 4K with ultra textures and a little AA on top over shiny reflections any day.
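For a sense of scale on why 4K pushes VRAM budgets, a quick sketch; the render-target count and format below are illustrative assumptions, and in practice textures and geometry, not the framebuffer, make up most of that 10 GB:

width, height = 3840, 2160
bytes_per_pixel = 4            # e.g. one RGBA8 render target
render_targets = 6             # assumed G-buffer plus back/depth buffers (illustrative)

framebuffer_mb = width * height * bytes_per_pixel * render_targets / 1024**2
print(f"{framebuffer_mb:.0f} MB")   # ~190 MB; the remaining gigabytes are mostly textures and geometry

That's why ultra texture packs are what actually blow past 8 GB: the framebuffer itself is a rounding error next to the asset data.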