News Nvidia vs AMD vs Intel: Last 12 Months of GPUs Are Disappointing


Eximo

Titan
Ambassador
Given that the console has 16 GB of shared memory, the split will most likely fluctuate around 8 + 8 GB, which does not look like a problem at all for the 4060, which has those 8 GB. And we must also remember that the console has to drive a 4K TV, while the 4060 is positioned as a video card for 1080p.

They don't render most games at 4K either. They are only upscaled to 4K.
 
Most people don't view market segments by memory bandwidth; they view it by compute performance and then whatever may bottleneck it. The easily foreseeable future is higher texture resolutions readily blowing through 8GB, and practically everyone who knows a thing about graphics quality vs compute effort will tell you that higher-resolution textures are the lowest-hanging fruit on the visual quality tree. Cards that lack the bandwidth and space to accommodate that will hit brick walls in the near future, well beyond bad launch-month ports.

Again, any settings that you would be using a 4060 at would not use more than 8GB of graphics memory. Once you've raised the relevant settings, mostly texture size, the card wouldn't be able to render a playable framerate regardless of the amount of memory. Nvidia is pushing DLSS 3 precisely because of how low this card's relevant performance is. The 4060 really should have been called a 4050.
 
They don't render most games at 4K either. They are only upscaled to 4K.

Consoles generally render at 1080p and then do a 2x HQ upscale to 2160p. Graphics memory is only useful for the immediate scene, and the single largest consumer is stored textures. These weak GPUs don't really have the power to use 4096x4096 textures anyway; expect them to use 1024x1024, at most 2048x2048.
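
To put rough numbers on the memory side, here is a back-of-envelope sketch (plain Python, uncompressed RGBA8 with a full mip chain; real games use block compression and stream thousands of textures, so treat the figures as illustrative only):

```python
# Approximate VRAM cost of one uncompressed RGBA8 texture with a full mip chain.
def texture_vram_mb(size, bytes_per_texel=4, with_mips=True):
    total = 0
    while size >= 1:
        total += size * size * bytes_per_texel
        if not with_mips:
            break
        size //= 2  # each mip level halves both dimensions
    return total / (1024 * 1024)

for s in (1024, 2048, 4096):
    print(f"{s}x{s}: {texture_vram_mb(s):.1f} MB with mips")
# 1024x1024: ~5.3 MB, 2048x2048: ~21.3 MB, 4096x4096: ~85.3 MB
```

Every doubling of texture resolution roughly quadruples the memory cost, which is why the jump from 1K/2K assets to 4K assets is what actually pressures an 8GB card.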
 

InvalidError

Titan
Moderator
Again, any settings that you would be using a 4060 at would not use more than 8GB of graphics memory. Once you've raised the relevant settings, mostly texture size, the card wouldn't be able to render a playable framerate regardless of the amount of memory.
Pushing higher texture resolutions has negligible impact on compute power requirements. It is an almost strictly memory size and bandwidth problem of storing and accessing more mipmaps between the lowest and highest resolutions to more seamlessly accommodate all view distances. Like in Hogwarts Legacy: the patch fixed the horrible stuttering when textures are being swapped into VRAM, but if you pay close attention to textures, you can see things popping in and out of higher-resolution textures when the GPU is low on memory. You may not notice it while moving, but it can be jarring while standing still.
 
Pushing higher texture resolutions has negligible impact on compute power requirements. It is an almost strictly memory size and bandwidth problem of storing and accessing more mipmaps between the lowest and highest resolutions to more seamlessly accommodate all view distances. Like in Hogwarts Legacy: the patch fixed the horrible stuttering when textures are being swapped into VRAM, but if you pay close attention to textures, you can see things popping in and out of higher-resolution textures when the GPU is low on memory. You may not notice it while moving, but it can be jarring while standing still.

Texture size doesn't impact compute requirements, but resolution does, and resolution has a heavy impact on how useful max texture size is. Rendering a 2048x2048 texture to a 1920x1080 screen is rather useless, and 4096x4096 is downright silly. Increase render resolution and we can increase texture size for meaningful gains, which heavily increases memory utilization. This is why 8GB on the 4060 is perfectly fine: the card mostly plays 1080p stuff with the occasional 1440p, and to make the memory size a bottleneck we'd have to increase resolution, which the 4060 simply doesn't have the compute power for.
 

InvalidError

Titan
Moderator
Texture size doesn't impact compute requirements, but resolution does, and resolution has a heavy impact on how useful max texture size is. Rendering a 2048x2048 texture to a 1920x1080 screen is rather useless, and 4096x4096 is downright silly.
Wrong: for textures to come out consistently cleanly after rotation and scaling, source resolution needs to be at least one notch higher than the render resolution. If you render at close to 1:1, there will be all sorts of shimmering, aliasing, blurriness, etc. depending on how the GPU's sampling pattern interacts with texture details.
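
A rough way to see where that "one notch higher" rule comes from: GPUs pick a mip level from how many texels a screen pixel covers. The snippet below is a simplified illustration of that idea, not any specific API's exact LOD formula:

```python
import math

def approx_mip_level(texture_res, screen_px_spanned):
    # texels per screen pixel along one axis; log2 of that picks the mip
    footprint = texture_res / screen_px_spanned
    return max(0.0, math.log2(footprint))

# 2048 texture spanning only 512 screen pixels: sampled around mip 2 (512x512)
print(approx_mip_level(2048, 512))   # ~2.0
# Same texture filling 2048 screen pixels (face against the wall): top mip at ~1:1,
# which is exactly where rotation and scaling start to shimmer or blur
print(approx_mip_level(2048, 2048))  # 0.0
```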
 
Wrong: for textures to come out consistently cleanly after rotation and scaling, source resolution needs to be at least one notch higher than the render resolution. If you render at close to 1:1, there will be all sorts of shimmering, aliasing, blurriness, etc. depending on how the GPU's sampling pattern interacts with texture details.
I wouldn't say wrong so much as there are limited edge cases where a higher resolution texture can help. In practice, though, 1K textures at 1080p look virtually indistinguishable from 2K textures at 1080p.

The bigger issue is that game engines don't always do the "smart" thing and so if you enable 2048x2048 textures (HD texture pack) at 1920x1080, even if those higher resolution textures are rarely used, they'll get loaded into memory. When that happens, you can end up with game engines that swap the 2K textures in and out of VRAM, trying to keep everything (that won't fit) loaded.

Basically, texture management is complex, and a lot of games don't do a great job at it. This is one of the cases where DX11 with the drivers doing the legwork is often better than DX12 with the game engine trying to optimize things.
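
As a toy illustration of that swapping behavior (not any real engine's code, and the budget and sizes below are made up), an engine that tries to keep everything resident under a fixed VRAM budget starts thrashing as soon as the working set of an HD texture pack exceeds that budget:

```python
from collections import OrderedDict

class TexturePool:
    # Keeps the most recently used textures resident; evicts the oldest when over budget.
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # name -> size_mb, ordered by last use
        self.used_mb = 0.0
        self.evictions = 0

    def touch(self, name, size_mb):
        if name in self.resident:
            self.resident.move_to_end(name)
            return
        while self.used_mb + size_mb > self.budget_mb and self.resident:
            _, freed = self.resident.popitem(last=False)  # evict least recently used
            self.used_mb -= freed
            self.evictions += 1  # every eviction means a PCIe re-upload later
        self.resident[name] = size_mb
        self.used_mb += size_mb

pool = TexturePool(budget_mb=6144)          # ~6GB left for textures on an 8GB card
for frame in range(3):
    for i in range(400):                    # hypothetical scene with 400 materials
        pool.touch(f"tex_{i}", 21.3)        # 2K RGBA with mips; a 1K version is ~5.3
print(pool.evictions)                       # nonzero -> constant swapping, i.e. stutter
```

Run the same loop with 5.3 MB (1K) textures and the eviction count stays at zero, which is roughly the difference between the HD pack stuttering and the default textures being fine.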
 

InvalidError

Titan
Moderator
I wouldn't say wrong so much as there are limited edge cases where a higher resolution texture can help. In practice, though, 1K textures at 1080p look virtually indistinguishable from 2K textures at 1080p.
The specific resolution of textures doesn't matter. What does is the scaling factor a texture gets rendered at on screen. If you want things to come out consistently clean, the texture resolution must always be higher than the resolution that texture gets presented at. In a game where your character's face may grind against walls, you may want those 2K-resolution bricks even at 1080p, possibly higher depending on how intimate the game lets you get with the wall. How much resolution do you need on a screw-head texture? That depends on how close your viewport can get to that screw head, how much of your field of view it fills in the worst case, and how many screen pixels that is. Same with everything else.

Of course, by the time you are that close to a wall, there is basically nothing else to render and it becomes a strictly VRAM size issue, which circles back to the argument of 8GB getting too tight for comfort.

Just about anyone who plays games should have come across many instances of things they wanted to take a closer look at, only to realize that, whatever those things were, they were never meant to be looked at from that close. Higher-resolution textures all around would reduce that.
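
Putting that rule of thumb into a formula (purely illustrative; the pixel counts are made-up examples, not measurements from any game):

```python
import math

def needed_texture_res(max_screen_px_covered, headroom_notches=1):
    # size the texture to the worst-case screen coverage, rounded up to a power
    # of two, plus one "notch" of headroom for rotation and scaling
    return 2 ** (math.ceil(math.log2(max_screen_px_covered)) + headroom_notches)

print(needed_texture_res(60))    # screw head you can never get closer than ~60 px to -> 128
print(needed_texture_res(1920))  # wall you can press the camera against at 1080p -> 4096
```

Which is the whole point: the worst-case wall needs far more texture resolution than the render resolution alone would suggest, and that is where the VRAM goes.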
 
The specific resolution of textures doesn't matter. What does is the scaling factor a texture gets rendered at on screen. If you want things to come out consistently clean, the texture resolution must always be higher than the resolution that texture gets presented at. In a game where your character's face may grind against walls, you may want those 2K-resolution bricks even at 1080p, possibly higher depending on how intimate the game lets you get with the wall. How much resolution do you need on a screw-head texture? That depends on how close your viewport can get to that screw head, how much of your field of view it fills in the worst case, and how many screen pixels that is. Same with everything else.

Of course, by the time you are that close to a wall, there is basically nothing else to render and it becomes a strictly VRAM size issue, which circles back to the argument of 8GB getting too tight for comfort.

Just about anyone who plays games should have come across many instances of things they wanted to take a closer look at, only to realize that, whatever those things were, they were never meant to be looked at from that close. Higher-resolution textures all around would reduce that.
Even if you get really close to a wall, to where a 2K or higher res texture might be useful, I dispute the claim that it would really look much better. There's only so much you can do with a flat polygon, and I've never played any game where I thought, "Man, the textures look fine but when I get too close to a wall, it looks bad." It's all relative. You might be able to make the wall look slightly better with a higher resolution texture, but not so much so that I think it's a critical factor.

Basically, DLSS, FSR2, and XeSS at 2X upscaling don't usually look much different from native rendering. And for textures, you get a similar 2X upscaling range (which is really 2X in each dimension, so 4X) of wiggle room. You can make a few things look better, sure, but it doesn't really matter in terms of making a game look significantly better.

Most textures that are used in the final rendered output are probably 256x256 and 512x512 mipmaps. 1K and 2K textures only matter (a bit) in select cases, mostly if you're playing at 4K resolution.
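
A quick sanity check on that, using the same texel-footprint idea as earlier (illustrative numbers only): at 1080p, an object spanning a few hundred screen pixels gets sampled from small mips even if it ships a 2K texture.

```python
import math

def sampled_mip(texture_res, screen_px_spanned):
    return max(0, round(math.log2(texture_res / screen_px_spanned)))

for span in (100, 300, 600):
    lvl = sampled_mip(2048, span)
    print(f"{span} px on screen -> mip {lvl} ({2048 >> lvl}x{2048 >> lvl})")
# 100 px -> mip 4 (128x128), 300 px -> mip 3 (256x256), 600 px -> mip 2 (512x512)
```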
 

InvalidError

Titan
Moderator
You might be able to make the wall look slightly better with a higher resolution texture, but not so much so that I think it's a critical factor.
IMO, there hasn't been such a thing as a "critical factor" in game visual quality in ~10 years; it's all incremental improvements that some may care more about than others.

Another example of where higher resolution textures would help is floors: many games don't intend for you to look at or close to your character's feet and notice how low-res the floor actually is. Still kind of immersion-breaking and what has been seen cannot be unseen.
 
 