Nvidia GeForce RTX 4070 Review: Mainstream Ada Arrives


kal326

Distinguished
Dec 31, 2007
Are we going to hit a point where APUs are the new norm for gaming? Is this leaving the dedicated market to only the insanely rich in a few generations' time? For the 'portable' market that is great news, but for everyone else in PC land... not so much.
That a 1060 only just got bumped from the top of the Steam hardware survey should be a pretty good indication of the sad state of the PC gaming landscape as a whole. The top five as of March 2023 are the 3060, 2060, 1060, 3070, and 3060 Ti, with shares ranging from 10.44% down to 4.95%. The highest-placed 40-series cards are the 4090 at 0.25%, the 4070 Ti at 0.23%, and the 4080 at 0.19%. Also, the primary single-monitor resolution is 1080p.

The historically highest-priced card of that bunch is the 3070, but the 3060 Ti wasn't far off. The lack of a sub-$500 card in any sort of availability from either side is probably what is really going to kill PC gaming. RX 6650 XT cards are just now hitting sub-$300 and are good for 1080p, but not much more than 70-80 FPS. Add to that the fact that we are six months past the launch of the 4090 and roughly four months past the launch of the RX 7900, with actual entry-level cards still nowhere to be seen, and it seems like both sides are really just trying to milk the market.
 

Deleted member 2838871

Guest
That a 1060 only just got bumped from the top of the Steam hardware survey should be a pretty good indication of the sad state of the PC gaming landscape as a whole. The top five as of March 2023 are the 3060, 2060, 1060, 3070, and 3060 Ti, with shares ranging from 10.44% down to 4.95%. The highest-placed 40-series cards are the 4090 at 0.25%, the 4070 Ti at 0.23%, and the 4080 at 0.19%. Also, the primary single-monitor resolution is 1080p.

The fact that 1080p is the primary resolution explains a lot. I most certainly would not have paid $1750 for a 4090 to game in 1080p... any of those low end cards could do that.

4K Ultra gaming though? 4090 without question... especially given it's such a huge (50%) performance upgrade over the 3090.
 
  • Like
Reactions: atomicWAR
@JarredWaltonGPU thanks for the review! I saw that you retested all the GPUs with recent drivers, which must have taken a huge effort (y)
PS: Are you planning to add or replace some older titles with new games for future benchmarks?
Always looking, but I need good tests — good as in not hard to run, preferably with a built-in benchmark, and also popular and nice-looking. I don't want to add things that are too new, as performance can change way too much with patches, but everything older than two years could use a more modern replacement. So I am up for ditching Borderlands 3, Bright Memory Infinite (benchmark), Control, Forza Horizon 5, Horizon Zero Dawn, Metro Exodus Enhanced, Red Dead Redemption 2, Total War: Warhammer, and Watch Dogs Legion. :D

That's nearly my whole gaming test suite, but in recent months I haven't noticed anything terribly exciting that I'd want to put into the rotation. Basically the only stuff I like and want to keep right now is Cyberpunk 2077, Far Cry 6, Flight Simulator, Minecraft, A Plague Tale: Requiem, and Spider-Man: Miles Morales. Not because they're the best tests and games ever, but just because they're relatively recent (or have been updated recently) and tend to be reasonably painless to test. Cyberpunk 2077 actually just added a new option to allow me to retest without having to go back to the main menu, and you wouldn't believe how excited and happy that made me. LOL

Anyway, if you have any suggestions (with a major criterion being that the game is relatively demanding — I don't intend to add CSGO to my list, for example), let me know.
 
This card almost certainly costs less than the RTX 3070 to make...
You say that based on what exactly? TSMC 4N at 294.5mm^2 versus Samsung 8N at 392.5mm^2, 50% more memory with significantly higher clocks, R&D costs... Given most people say TSMC N4 costs at least twice as much per mm^2 as Samsung 8N, that basically means the AD104 is equivalent to a ~600mm^2 8N part. So by that metric, the card probably costs almost the same to make as the RTX 3080. Well, maybe. PCB costs went down (192-bit interface), and a 200W TGP means it's cheaper to make than a 320W TGP board.
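If you want to see that math spelled out, here's a rough back-of-the-envelope sketch. The 2x cost-per-mm^2 ratio is the hearsay figure above, the GA102 die size is the commonly cited number, and this looks at silicon only (no yield, memory, PCB, or cooler costs), so treat it as ballpark rather than anything authoritative:

```python
# Back-of-the-envelope die-cost comparison. All inputs are assumptions taken
# from the discussion above (plus the commonly cited GA102 die size), not
# confirmed foundry pricing.

AD104_AREA_MM2 = 294.5       # RTX 4070 die, TSMC 4N
GA104_AREA_MM2 = 392.5       # RTX 3070 die, Samsung 8N
GA102_AREA_MM2 = 628.4       # RTX 3080/3090 die, Samsung 8N (commonly cited figure)
COST_RATIO_4N_VS_8N = 2.0    # assumed: TSMC 4N costs ~2x per mm^2 vs Samsung 8N

# Express AD104 as an "equivalent" Samsung 8N area for cost purposes.
equivalent_8n_area = AD104_AREA_MM2 * COST_RATIO_4N_VS_8N
print(f"AD104 equivalent 8N area: ~{equivalent_8n_area:.0f} mm^2")  # ~589 mm^2

# Relative silicon cost only; ignores yield, memory, PCB, VRM, cooler, etc.
print(f"Relative die cost vs GA104 (3070): ~{equivalent_8n_area / GA104_AREA_MM2:.2f}x")  # ~1.50x
print(f"Relative die cost vs GA102 (3080): ~{equivalent_8n_area / GA102_AREA_MM2:.2f}x")  # ~0.94x
```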

But my point is that I would not confidently assert that "it almost certainly costs less than a 3070 to make." On the contrary, given we know that TSMC had a ton of demand for 5nm-class manufacturing last year and a lot of companies had to "overpay" for capacity, I'd wager it's more likely that it costs slightly more to make than its predecessor. Then factor in inflation plus economic conditions and $599 feels about right. I don't love the price, but I can't really get angry over it for what is mostly a luxury item.

I know in the past year or so, my monthly food and gas expenses (for the family) easily increased by 25%. I have it in my budget spreadsheet. So by that metric $599 almost feels like a good deal.
 

atomicWAR

Glorious
Ambassador
The fact that 1080p is the primary resolution explains a lot. I most certainly would not have paid $1750 for a 4090 to game in 1080p... any of those low end cards could do that.

4K Ultra gaming though? 4090 without question... especially given it's such a huge (50%) performance upgrade over the 3090.

<----games at 4K144hz

Yeah the 4090 is no question the 4K high refresh rate king at the moment. And I ended up being one of those suckers, as I was coming from my RTX 2080 Ti with 11GB, which is now VRAM-'anemic' at 4K (4K was trouble... though even 1440p could struggle with games that launched this year, like RE4, where you really need 12GB). Sadly, it's just more proof that Nvidia tends to short cards on VRAM, as 2-2.5 generations shouldn't be the expiration date for your VRAM. It should be closer to a full three generations before a high-end card's VRAM is a problem, IMHO.

Anyway, between the cost of the 4080 being so high relative to its performance gap with the 4090, and its VRAM being lighter than it should be (20-24GB IMO) for a second-tier card, it was a very unappealing option for me. Whereas the 4090 should have more like 28-32GB of RAM... regardless, pricing this gen made the 4090 a no-brainer for 4K high refresh rates. Sadly, the 4080 priced itself into oblivion; the 80 series is historically the tier I WOULD have purchased for my rig, but the price jump this gen has either upsold folks to the 90 class, downsold them to the 70 Ti class, or driven them to remaining stock of last-gen cards or the used market.
 

Deleted member 2838871

Guest
<----games at 4K144hz

Yeah the 4090 is no question the 4K high refresh rate king at the moment.

Hahah! You just reminded me! I've been gaming at 4K 60 (and it's amazing looking in games like Hogwarts and Last of Us) but I really need to try 4K 120!

https://www.rtings.com/monitor/reviews/lg/48-cx-oled

My OLED display supports it... but honestly I haven't tried it yet with the new 4090 upgrade. I totally forgot because the 3090 couldn't do it and I never tried again. :LOL:
 

atomicWAR

Glorious
Ambassador
I know in the past year or so, my monthly food and gas expenses (for the family) easily increased by 25%. I have it in my budget spreadsheet. So by that metric $599 almost feels like a good deal.

I hear that argument and truly see where you're coming from, but I feel like Nvidia/AMD owed gamers some goodwill after the crypto boom/COVID supply issues/scalp-a-thon we were greeted with last gen. Especially considering what Jensen just said about crypto adding nothing useful to society, while being all too happy to cash in when it was convenient for their bottom line. Anyway, it felt very insulting as a long-time user of Nvidia's products. I also know I am not alone, nor even in the minority, for that matter.

Had Nvidia priced the 4080 lower, say $949, with a Ti SKU marketed at that $1,200 MSRP that cut the 4090-to-4080 performance delta in half, and also dropped the RTX 4070 Ti MSRP by another $50 to $749, a lot of gamers could have swallowed this 40-series pricing (i.e., $599 RTX 4070s, to jump back on topic) and mostly considered themselves victims of inflation. Sadly, that's just not how this gen played out, and accordingly we are seeing a lot of fairly righteous outrage in the forums over Nvidia's and AMD's pricing behavior this gen and as a whole lately (post-RTX).
 
  • Like
Reactions: SydB

Elusive Ruse

Commendable
Nov 17, 2022
Always looking, but I need good tests — good as in not hard to run, preferably with a built-in benchmark, and also popular and nice-looking. I don't want to add things that are too new, as performance can change way too much with patches, but everything older than two years could use a more modern replacement. So I am up for ditching Borderlands 3, Bright Memory Infinite (benchmark), Control, Forza Horizon 5, Horizon Zero Dawn, Metro Exodus Enhanced, Red Dead Redemption 2, Total War: Warhammer, and Watch Dogs Legion. :D

That's nearly my whole gaming test suite, but in recent months I haven't noticed anything terribly exciting that I'd want to put into the rotation. Basically the only stuff I like and want to keep right now is Cyberpunk 2077, Far Cry 6, Flight Simulator, Minecraft, A Plague Tale: Requiem, and Spider-Man: Miles Morales. Not because they're the best tests and games ever, but just because they're relatively recent (or have been updated recently) and tend to be reasonably painless to test. Cyberpunk 2077 actually just added a new option to allow me to retest without having to go back to the main menu, and you wouldn't believe how excited and happy that made me. LOL

Anyway, if you have any suggestions (with a major criterion being that the game is relatively demanding — I don't intend to add CSGO to my list, for example), let me know.
Appreciate your reply Jarred, here are some off the top of my head:
Dying Light 2
Elden Ring
RE Village
God Of War
Callisto Protocol
Dead Space
COD MW2
Hogwarts Legacy
RE 4 Remake (I know, too recent!)

Not all of them have built-in benchmarks but IMO they all have gorgeous looking graphics.
 
"Not to beat a dead horse, but while DP1.4a is technically worse than HDMI 2.1 and DisplayPort 2.1 (54 Gbps), we have tested a bunch of GPUs with a Samsung Odyssey Neo G8 that supports up to 4K and 240 Hz, thanks to Display Stream Compression (DSC). Until we start seeing DisplayPort 2.1 monitors that run at 4K and 480Hz, DisplayPort 1.4a should suffice. "

This right here is a deal breaker for me and the only reason I'm not entertaining the 4000 series. Let me rephrase that paragraph: "It's just good enough for yesterday's tech, and with some trickery in video compression it will suffice for today's technology, but it's utterly incapable of tomorrow's tech and a very, very poor investment for the future." This is backwards thinking, and with the high price tag, NOPE!
Here's the funny thing: Name me some monitors that you know support DSC. It has been around for a while, yes, but it actually hasn't been used much. This Samsung Odyssey Neo G8 is one of the first I've encountered with DSC support. So I guess anything that has existed as a standard and not been used until recently is "yesterday's technology." We could also rephrase your rephrasing to this: "I really need to buy 'future proof' hardware with support for standards that barely exist, in the hope that some day over the 3-year life of this product I'll actually be able to use that tech!" By the time monitors and displays that can actually benefit from DP2.1 are available at not-insane prices, we'll be on to the next-generation GPUs, and I'm positive Nvidia will finally leave DP1.4 behind with Blackwell. (Though I was almost certain that would happen with Ada and that ended up being wrong...)

I'm not opposed to DisplayPort 2.1 support in AMD, but it's truly not a big deal. We only have a very limited number of monitors that just came out in the past six months that approach the limits of DisplayPort 1.4a with DSC (as in, 4K and 240Hz). If 8K actually takes off (doubtful in the next five years), maybe we'll see displays that truly need DP2.1 bandwidth — 8K and 120Hz. Now read that again. 8K. 120 Hz.

Do you know how much high fps 8K content there is? Practically none, and what there is exists mainly for marketing purposes. 8K and 60 fps is barely a thing, and it's already supported with DP1.4a. 8K and 120 fps is the practical limit of what we might ever really need, because while we could reasonably do 120Hz videos and gaming at some point, it's a massive stretch. Gaming at 8K and 120Hz? That's 4X the pixels of 4K, and even the fastest GPUs, using Frame Generation, rarely get above 120Hz at 4K.
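For anyone who wants rough numbers behind that, here's a quick sketch of the bandwidth math. It assumes 10-bit RGB, ignores blanking and FEC overhead, and uses the usual published payload rates for the links plus the 3:1 DSC ratio I mention below, so treat everything as ballpark:

```python
# Rough link-bandwidth math for the display modes discussed above. Ignores
# blanking and FEC overhead and assumes 10-bit RGB (30 bits per pixel), so
# these are ballpark figures only.

def raw_gbps(width, height, refresh_hz, bits_per_pixel=30):
    return width * height * refresh_hz * bits_per_pixel / 1e9

DP14A_LIMIT = 25.92   # Gbps payload, DP 1.4a HBR3 x4 lanes after 8b/10b coding
DP21_UHBR13 = 52.4    # Gbps payload, DP 2.1 UHBR13.5 x4 lanes after 128b/132b coding
DSC_RATIO = 3.0       # maximum DSC compression ratio

for name, w, h, hz in [("4K 240Hz", 3840, 2160, 240),
                       ("8K 60Hz", 7680, 4320, 60),
                       ("8K 120Hz", 7680, 4320, 120)]:
    raw = raw_gbps(w, h, hz)
    print(f"{name:>9}: ~{raw:.0f} Gbps uncompressed, ~{raw / DSC_RATIO:.0f} Gbps with 3:1 DSC "
          f"(DP1.4a ~{DP14A_LIMIT} Gbps, DP2.1 UHBR13.5 ~{DP21_UHBR13} Gbps)")
```

By that math, 4K 240Hz and 8K 60Hz both squeeze into DP1.4a with DSC, and only something like 8K 120Hz actually needs DP2.1-class bandwidth.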

8K is basically a living room standard for placebo effect. 8K for movie theaters with 80-foot screens, sure, but on a 120-inch projection viewed from 20 feet away? Your eyes literally can't see the difference. 8K on a screen that's three feet away? Sounds like great marketing material. We could call it the "Retina 8K display" and get all the Apple users to buy it, and then run at 200% scaling. "OMG the pixels are so small that I can't see them and need a magnifier!" That's already basically true of 4K on a 27-inch monitor, like the one I'm using right now (and have been using for eight years).

Calling DSC "trickery in video compression" could apply to every video codec ever released. Well, except DSC isn't a traditional video codec; it's a very high-throughput algorithm that, depending on the screen content, can deliver either lossless compression (on a lot of content) or up to 3:1 compression with minimal artifacting. And I really do mean that 'minimal' bit. Good luck spotting it. It's not at all like 4:2:2 or 4:2:0 "compression" where the result is lossy and you might actually notice (maybe, depending on the implementation — RTX GPUs and RX 5000 and later look fine; GTX 10-series not so much). The only real problem with DSC is that if you get any signal corruption, even a single flipped bit, it can seriously screw up transmission. That's probably why it hasn't been used until now.
 
  • Like
Reactions: hotaru.hino
You say that based on what exactly? TSMC 4N at 294.5mm^2 versus Samsung 8N at 392.5mm^2, 50% more memory with significantly higher clocks, R&D costs... Given most people say TSMC N4 costs at least twice as much per mm^2 as Samsung 8N, that basically means the AD104 is equivalent to a ~600mm^2 8N part. So by that metric, the card probably costs almost the same to make as the RTX 3080. Well, maybe. PCB costs went down (192-bit interface), and a 200W TGP means it's cheaper to make than a 320W TGP board.

But my point is that I would not confidently assert that "it almost certainly costs less than a 3070 to make." On the contrary, given we know that TSMC had a ton of demand for 5nm-class manufacturing last year and a lot of companies had to "overpay" for capacity, I'd wager it's more likely that it costs slightly more to make than its predecessor. Then factor in inflation plus economic conditions and $599 feels about right. I don't love the price, but I can't really get angry over it for what is mostly a luxury item.

I know in the past year or so, my monthly food and gas expenses (for the family) easily increased by 25%. I have it in my budget spreadsheet. So by that metric $599 almost feels like a good deal.
You're starting out with an extremely flawed premise, as you're ignoring the basic reality of fabrication. They're going to be getting roughly 30% more chips per wafer based on die size alone. That means even if the price per wafer is double, which it may be, they're paying around 50% more per chip. This all assumes yields are the same, which may or may not be the case since these chips are smaller (smaller dies generally yield better). The PCB is smaller, requires less power delivery, and has fewer memory chips to deal with, as these will be 2GB chips instead of 1GB. The memory market is heavily depressed, and I'd be shocked if they were paying more for this 12GB than for the 8GB on the 3070.
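For anyone who wants to check that wafer math, here's a rough sketch using the common gross-die-per-wafer approximation. It assumes 300mm wafers and the die areas quoted above, ignores defect yield, and the 2x wafer-price figure is the same assumption from earlier in the thread, so it's only ballpark:

```python
import math

# Quick sanity check of the "more chips per wafer" argument using the common
# gross-die-per-wafer approximation. Die areas come from the discussion above;
# defect yield is ignored, so this is only a rough sketch.

WAFER_DIAMETER_MM = 300.0

def gross_dies_per_wafer(die_area_mm2, d=WAFER_DIAMETER_MM):
    # Classic approximation: usable wafer area divided by die area, minus edge losses.
    return math.pi * d**2 / (4 * die_area_mm2) - math.pi * d / math.sqrt(2 * die_area_mm2)

ad104 = gross_dies_per_wafer(294.5)   # RTX 4070 die (TSMC 4N)
ga104 = gross_dies_per_wafer(392.5)   # RTX 3070 die (Samsung 8N)

print(f"AD104: ~{ad104:.0f} dies/wafer, GA104: ~{ga104:.0f} dies/wafer")
print(f"AD104 gets ~{(ad104 / ga104 - 1) * 100:.0f}% more dies per wafer")

# If a 4N wafer really costs ~2x an 8N wafer (assumption), relative cost per die is:
print(f"Relative cost per die at 2x wafer price: ~{2.0 * ga104 / ad104:.2f}x")
```

Edge effects actually push the gain a bit past the plain area ratio, but either way a doubled wafer price works out to roughly 50% more per die, not double.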

It's absolutely okay to call out bad pricing that is simply designed to maintain or grow already-high margins, "luxury" item or not.
 
  • Like
Reactions: LolaGT
Appreciate your reply Jarred, here are some off the top of my head:
Dying Light 2
Elden Ring
RE Village
God Of War
Callisto Protocol
Dead Space
COD MW2
Hogwarts Legacy
RE 4 Remake (I know, too recent!)

Not all of them have built-in benchmarks but IMO they all have gorgeous looking graphics.
I've actually looked at a bunch of those. Unfortunately, I had issues with almost all of them.

Dying Light 2: The day/night cycle, dynamic weather, etc. make this extremely irritating to test. Performance can vary by 15% or more on the same GPU between runs if one is at, say, 8am and another is at 8pm. Pass. I wish there were a built-in benchmark, as I would probably add it to the RT list otherwise!

Elden Ring: Actually one of the few games I haven't even tried. I know, the Dark Souls people love this sort of game, but unless I'm mistaken there's no built-in benchmark here either.

RE Village: I wasn't super impressed with the visuals. They're good, but not amazing. The RT effects are very lackluster, though, and no built-in benchmark when I looked.

God Of War: I did some preliminary testing, but didn't find much reason to keep this around. FSR support was the biggest draw, and again, no built-in benchmark. Sigh.

Callisto Protocol: Seems to be one of those games with artificially high VRAM requirements, and reviews were mediocre. There is a built-in benchmark at least, but I don't think many people are playing this one. I've poked at it a bit but never tried to collect more than a couple of test runs. I also played with the ray tracing settings and found they didn't seem to help visuals much (which has been a recurring problem with AMD-promoted games and RT: making it run faster with RT enabled often means making the RT not do much).

Dead Space (remake): Haven't tried this one at all, but I don't usually give much time to remakes/remasters.

COD MW2: No benchmark mode, getting repeatable runs takes more effort. I've already got four games I have to manually test, and they're a thorn in my side. LOL

Hogwarts Legacy: I poked at this, and the game has been super glitchy. I was actually just playing it "for fun" and then at one point I couldn't even get the game to load properly for like three weeks! That was enough for me to write it off as a potential benchmark candidate, and the lack of a built-in test sealed the deal.

RE4 Remake (I know, too recent!): Yeah, this is too new. If I had a clone or two, I would have made him run some launch benchmarks! :-D At least it's not one of those "look at the crappy port" games, but again I don't normally pay much heed to performance testing on remakes.

If any of those have added a benchmark mode that I'm not aware of, let me know! Callisto is probably the closest to something I'd consider adding, though it would be in the non-DXR section unless I've missed something.
 
  • Like
Reactions: Elusive Ruse

Deleted member 2838871

Guest
Hogwarts Legacy: I poked at this, and the game has been super glitchy. I was actually just playing it "for fun" and then at one point I couldn't even get the game to load properly for like three weeks! That was enough for me to write it off as a potential benchmark candidate, and the lack of a built-in test sealed the deal.

That's too bad... I just started it this week and have had no performance issues. The game looks amazing in 4K Ultra.
 
You're starting out with an extremely flawed premise, as you're ignoring the basic reality of fabrication. They're going to be getting roughly 30% more chips per wafer based on die size alone. That means even if the price per wafer is double, which it may be, they're paying around 50% more per chip. This all assumes yields are the same, which may or may not be the case since these chips are smaller (smaller dies generally yield better). The PCB is smaller, requires less power delivery, has fewer memory chips to deal with, as these will be 2GB chips instead of 1GB, and there's the narrower memory bus. The memory market is heavily depressed, and I'd be shocked if they were paying more for this 12GB than for the 8GB on the 3070.

It's absolutely okay to call out bad pricing that is simply designed to maintain or grow margins, "luxury" item or not.
And you're starting out with an extremely flawed premise that Nvidia shouldn't try to make a profit. It's like the people saying, "Nvidia and AMD made a lot of money over the past two years from crypto and they should pass along savings to their customers." That will literally never happen. The shareholders and board of directors would nix that idea if it was ever presented.

Is the $599 price awesome? No, absolutely not. Is it terrible, though? Also absolutely not. It feels like a price that Nvidia can justify, and it's equally likely that the market will support that price. I mean, it's better than any other 40-series so far in terms of value (depending on how you want to define that). Even if the costs for the RTX 4070 were lower than for the 3070, it would still be bad business to cut prices if you don't have to. Actually, what would most likely happen is that Nvidia would set a lower price and then retailers and AIBs would just mark it back up, so Nvidia would lose twice.
 

Deleted member 2838871

Guest
And you're starting out with an extremely flawed premise that Nvidia shouldn't try to make a profit. It's like the people saying, "Nvidia and AMD made a lot of money over the past two years from crypto and they should pass along savings to their customers." That will literally never happen. The shareholders and board of directors would nix that idea if it was ever presented.

It's funny, actually, because the people complaining about GPU prices are often the same people who drop $1,400 on an iPhone without batting an eye. :LOL:
 
And you're starting out with an extremely flawed premise that Nvidia shouldn't try to make a profit. It's like the people saying, "Nvidia and AMD made a lot of money over the past two years from crypto and they should pass along savings to their customers." That will literally never happen. The shareholders and board of directors would nix that idea if it was ever presented.

Is the $599 price awesome? No, absolutely not. Is it terrible, though? Also absolutely not. It feels like a price that Nvidia can justify, and it's equally likely that the market will support that price. I mean, it's better than any other 40-series so far in terms of value (depending on how you want to define that). Even if the costs for the RTX 4070 were lower than for the 3070, it would still be bad business to cut prices if you don't have to. Actually, what would most likely happen is that Nvidia would set a lower price and then retailers and AIBs would just mark it back up, so Nvidia would lose twice.
I'm not saying they shouldn't try to make a profit, but rather that you shouldn't defend this practice of ever-increasing margins by saying the price is okay. All of the 40-series cards (aside from the 4090, as halo products have always had stupid prices, justified or not) have seemingly been priced to make up for the lack of crypto-era volume. Apple did this with the iPhone when sales started to dip: they just increased their margins because they could get away with it. So many markets are doing this, and not enough people with an audience seem to be willing to call them out on it.
 

Deleted member 2838871

Guest
I'm not saying they shouldn't try to make a profit, but rather that you shouldn't defend this practice of ever-increasing margins by saying the price is okay. All of the 40-series cards (aside from the 4090, as halo products have always had stupid prices, justified or not) have seemingly been priced to make up for the lack of crypto-era volume. Apple did this with the iPhone when sales started to dip: they just increased their margins because they could get away with it. So many markets are doing this, and not enough people with an audience seem to be willing to call them out on it.

I would if the other guy made a better product.

My first Nvidia card was the GeForce 3 in 2001... and I've never owned an AMD card because IMO they weren't better... and still aren't. AMD CPUs I've had a few... but never a GPU.

Same can be said for Apple. Not a fan of their prices either... but the ecosystem meshes very well together. The last Android phone I had was the Galaxy S2... in like 2010 or something. Never had a reason to drop the iPhone (or iPad) from my device list.

Been with Verizon since my Nokia phone in 1998... because... yep... you guessed it... they have the best network and I've never had a reason to switch.
 
  • Like
Reactions: JarredWaltonGPU
I'm not saying they shouldn't try to make a profit, but rather that you shouldn't defend this practice of ever-increasing margins by saying the price is okay. All of the 40-series cards (aside from the 4090, as halo products have always had stupid prices, justified or not) have seemingly been priced to make up for the lack of crypto-era volume. Apple did this with the iPhone when sales started to dip: they just increased their margins because they could get away with it. So many markets are doing this, and not enough people with an audience seem to be willing to call them out on it.
And I'm saying we don't have enough data to clearly determine whether there's really "ever increasing margins." Based on what's happening in the PC market as a whole, I can unequivocally state that Nvidia is not making nearly as much money right now as it was over the past couple of years. Neither are AMD, Intel, Apple, Dell, etc. In fact, profits are looking rather catastrophic in the currently collapsing global economy. Nvidia has reportedly cut TSMC orders because it doesn't see the demand it had hoped for (though some of the orders are likely just being changed to Hopper H100, because the demand for AI is still there and Nvidia can make a ton of money selling $25,000+ H100 cards).

Markets can be cyclical, and I get that. Right now everything is trending down, and when you're not selling as many parts, every part you do sell has to shoulder a larger share of your R&D and other costs. We'll see what things look like in Nvidia's next earnings report. I don't think the company is doing poorly, but it's definitely doing worse right now than it has in the past few years. We already saw that with the February financials. The past two or three quarters are down to about half the revenue of the previous year or so. Goodbye, crypto mining profits! LOL

I think Nvidia is being smart right now. There's a downturn in gaming revenues (crypto), plus other factors, and the past two quarters have been quite poor compared to previous quarters. More importantly, I think Nvidia expects things to continue to be bad for at least several more quarters. Rather than overproducing (it still has a bunch of Ampere GPUs to sell I'd wager), it's keeping prices as high as it can and preparing to weather the storm. AMD is doing the exact same thing. It's why we still don't have anything below the RX 7900 XT from AMD on desktop.
 

Elusive Ruse

Commendable
Nov 17, 2022
I've actually looked at a bunch of those. Unfortunately, I had issues with almost all of them.

Dying Light 2: The day/night cycle, dynamic weather, etc. make this extremely irritating to test. Performance can vary by 15% or more on the same GPU between runs if one is at, say, 8am and another is at 8pm. Pass. I wish there were a built-in benchmark, as I would probably add it to the RT list otherwise!

Elden Ring: Actually one of the few games I haven't even tried. I know, the Dark Souls people love this sort of game, but unless I'm mistaken there's no built-in benchmark here either.

RE Village: I wasn't super impressed with the visuals. They're good, but not amazing. The RT effects are very lackluster, though, and no built-in benchmark when I looked.

God Of War: I did some preliminary testing, but didn't find much reason to keep this around. FSR support was the biggest draw, and again, no built-in benchmark. Sigh.

Callisto Protocol: Seems to be one of those games with artificially high VRAM requirements, and reviews were mediocre. There is a built-in benchmark at least, but I don't think many people are playing this one. I've poked at it a bit but never tried to collect more than a couple of test runs. I also played with the ray tracing settings and found they didn't seem to help visuals much (which has been a recurring problem with AMD-promoted games and RT: making it run faster with RT enabled often means making the RT not do much).

Dead Space (remake): Haven't tried this one at all, but I don't usually give much time to remakes/remasters.

COD MW2: No benchmark mode, getting repeatable runs takes more effort. I've already got four games I have to manually test, and they're a thorn in my side. LOL

Hogwarts Legacy: I poked at this, and the game has been super glitchy. I was actually just playing it "for fun" and then at one point I couldn't even get the game to load properly for like three weeks! That was enough for me to write it off as a potential benchmark candidate, and the lack of a built-in test sealed the deal.

RE4 Remake (I know, too recent!): Yeah, this is too new. If I had a clone or two, I would have made him run some launch benchmarks! :-D At least it's not one of those "look at the crappy port" games, but again I don't normally pay much heed to performance testing on remakes.

If any of those have added a benchmark mode that I'm not aware of, let me know! Callisto is probably the closest to something I'd consider adding, though it would be in the non-DXR section unless I've missed something.
Thanks for the in-depth reply, Jarred. As long as a few newer entries that would punish these new GPUs a bit get added, I'm happy.
 

thisisaname

Distinguished
Feb 6, 2009
You say that based on what exactly? TSMC 4N at 294.5mm^2 versus Samsung 8N at 392.5mm^2, 50% more memory with significantly higher clocks, R&D costs... Given most people say TSMC N4 costs at least twice as much per mm^2 as Samsung 8N, that basically means the AD104 is equivalent to a ~600mm^2 8N part. So by that metric, the card probably costs almost the same to make as the RTX 3080. Well, maybe. PCB costs went down (192-bit interface), and a 200W TGP means it's cheaper to make than a 320W TGP board.

But my point is that I would not confidently assert that "it almost certainly costs less than a 3070 to make." On the contrary, given we know that TSMC had a ton of demand for 5nm-class manufacturing last year and a lot of companies had to "overpay" for capacity, I'd wager it's more likely that it costs slightly more to make than its predecessor. Then factor in inflation plus economic conditions and $599 feels about right. I don't love the price, but I can't really get angry over it for what is mostly a luxury item.

I know in the past year or so, my monthly food and gas expenses (for the family) easily increased by 25%. I have it in my budget spreadsheet. So by that metric $599 almost feels like a good deal.

"Almost feels like a good deal" but in the end not :(