News: RTX 4090 Can Run 'Genshin Impact' at 13K Resolution

ManDaddio

Reputable
Oct 23, 2019
I have been playing at 4K for several years now. I have both 4K and 1440p 144 Hz screens. There is a noticeable difference, but viewers may not notice it depending on the quality of the display.

My 1440p screen has better colors and blacks than my 4K TV, so the 1440p screen can seem to look better. But when comparing screens of similar quality, 4K will look better every time.

And most people sit close to their screens, so I am not sure about the comment regarding distance.

Size of screen matters, color quality matters, and resolution matters.

If you look at screenshots, 4K will look far better. Video editors will also testify that 1440p is not better than 4K when upscaling or downscaling.
Context matters; so do perception, preconceived thinking, and the quality of the hardware.

I think some people crap on 4K because they want high refresh rates and/or can't afford a good 4K solution.

Yes, I said it. Get over it. 😄♥️
 
I play Waifu Impact at 1440p with my Vega64 and it looks quite good (hovers around 60FPS). I also play it on my phone (Samsung Note 10+) and it also looks quite good. Wherever you want to run this game, your Waifus are guaranteed to look good, so don't worry.

At over "8K" resolution, you just get more Waifu details and that is always welcome, just like you get tiny Waifus in your phone.

I did try Eyefinity with it, but not quite "13K" resolution. I haven't tried VR yet for it, but I'm sure the Waifus will still look great.

Regards :LOL:
 

brandonjclark

Distinguished
Dec 15, 2008
ManDaddio said:
I have been playing at 4K for several years now. I have both 4K and 1440p 144 Hz screens. ... I think some people crap on 4K because they want high refresh rates and/or can't afford a good 4K solution. Yes, I said it. Get over it. 😄♥️


This is actually a fair take.

When I purchased my 3090 rig I was dead set on a quality 1440p solution because to me, refresh rate is king, even in non-competitive titles.

So, I agree.
 

Deleted member 1353997

Guest
I don't find resolutions above 1440p useful. Nothing above that seems to actually add any detail unless you're sitting way too close to your screen.
This is actually demonstrably wrong. Here are 3 screenshots I took in Half Life 2 at different resolutions, all downsampled to 1080p for your convenience, so you can't argue with "sitting way too close to your screen". All 3 screenshots were taken at maximum graphical settings, including AA. Without AA, the differences would be even more obvious. I recommend opening all 3 screenshots in a new tab and using CTRL+TAB to see the differences even more clearly.

1080p
HL2-1080p.png


1440p
HL2-1440p.png

Notice the increased detail in the fence and the antenna on the house.

2160p (4K)
HL2-2160p.png

Notice the increased detail on the crane (especially one of the 2 support cables, which is now much more clearly visible). The antenna has gained even more detail, but the most obvious improvement is in the tree in the background on the left, where some of the branches are now being rendered that used to be invisible before. Fun fact: without AA, the mesh in the fence would be barely recognizable at 1080p and 1440p, and perfectly rendered in 4K. The tree in the foreground (at the top right) would also have holes at 1080p.

There's a reason why super resolution is a thing.
By the way, I originally wanted to take a screenshot at 3240p (5K Dynamic Super Resolution), but only the top left corner was visible (even in the screenshot), which wasn't very useful for comparing.

I mean, you guys are free to prioritise framerate over resolution; that's your choice. But to claim that there is no increase in (visible) detail is plain wrong. Have fun playing with your microscope, but stop assuming that everybody's as blind as you are.
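For anyone who wants to reproduce this kind of comparison, here is a minimal sketch of the downsampling step described above. The tooling is my own choice (Pillow), the filenames follow the attachment names in the post, and the captures are assumed to be the full-resolution originals:

Code:
from PIL import Image

# Resample each capture to a common 1080p canvas so only the render
# resolution differs, not the viewing size. Lanczos is used for the downscale.
for name in ("HL2-1080p.png", "HL2-1440p.png", "HL2-2160p.png"):
    img = Image.open(name)
    img.resize((1920, 1080), Image.LANCZOS).save("compare-" + name)

Flipping between the resulting "compare-" files with CTRL+TAB, as suggested above, makes the differences in the fence, the antenna, and the crane cables easiest to spot.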
 
  • Like
Reactions: BX4096 and 10tacle
1440p is the sweet spot for most everyone... up to a point.

Say you have two monitors with the same specs, one at 1440p and the other at 4K.

If they are 30 inches? You aren't going to notice much (the more you try to cram into a small panel, the less discernible the pixels become).

Now, if they are 40 inches? You might notice, because as the size gets bigger, the increased resolution becomes more noticeable.

But if you're an average Joe with a 32" monitor, you likely don't even want 4K. It pushes your hardware harder for no real benefit while lowering your frame rate (and 1440p at 244 Hz is going to feel/look better than 120 Hz 4K at a smaller screen size).

Now, if you're more into cinematic-style games, 4K is likely more beneficial (as you'd want a brighter/larger screen and not care much about frame rate), but those types of games are the minority.
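To put rough numbers on the size argument, here is an illustrative sketch with assumed 16:9 panels (the sizes are examples, not figures from the post above), showing how pixel density falls as the panel grows:

Code:
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch along the diagonal of a flat panel.
    return hypot(width_px, height_px) / diagonal_in

for size in (27, 32, 40):
    print(size, round(ppi(2560, 1440, size)), round(ppi(3840, 2160, size)))
# 27": ~109 PPI (1440p) vs ~163 PPI (4K)
# 32": ~92 PPI  vs ~138 PPI
# 40": ~73 PPI  vs ~110 PPI

By that rough measure, 1440p stretched over a 40" panel drops to about 73 PPI, which lines up with the point above that the larger the screen, the more the extra resolution of 4K pays off.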
 

ikernelpro4

Reputable
BANNED
Aug 4, 2018
I don't find resolutions above 1440p useful. Nothing above that seems to actually add any detail unless you're sitting way too close to your screen.

If you need a magnifying glass to perceive 8k, you probably need an electron microscope to see 13k.
The difference between 1440 and 2160 is substantial just in pixel count alone.

I would say that after 4k the differences become hard to tell.
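For reference, the raw pixel counts bear that out: 2560 × 1440 = 3,686,400 pixels versus 3840 × 2160 = 8,294,400 pixels, so 4K pushes 2.25 times the pixels of 1440p (and exactly 4 times the 2,073,600 pixels of 1080p).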
 
Deleted member 1353997 said:
This is actually demonstrably wrong. Here are 3 screenshots I took in Half Life 2 at different resolutions, all downsampled to 1080p for your convenience, so you can't argue with "sitting way too close to your screen". ... But to claim that there is no increase in (visible) detail is plain wrong.
They all look the same to me; resolution only matters when you factor in distance. On a 32" TV, for example, anything over 1080p isn't noticeable at a normal viewing distance, and that's a physical limitation of the human eye.
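The distance argument can be made concrete with a pixels-per-degree estimate. This is an illustrative sketch with assumed numbers (a 32" 16:9 panel viewed from about 3 feet, and the common rule of thumb that 20/20 vision resolves roughly 60 pixels per degree), not measurements from anyone in this thread:

Code:
from math import atan, degrees, hypot

def pixels_per_degree(width_px, height_px, diagonal_in, distance_in):
    # Horizontal pixels divided by the horizontal field of view in degrees.
    width_in = diagonal_in * width_px / hypot(width_px, height_px)
    fov_deg = 2 * degrees(atan(width_in / (2 * distance_in)))
    return width_px / fov_deg

for res in ((1920, 1080), (2560, 1440), (3840, 2160)):
    print(res, round(pixels_per_degree(*res, 32, 36)))
# ~45 PPD for 1080p, ~60 PPD for 1440p, ~91 PPD for 4K

By that rough yardstick, 1440p on a 32" panel at 3 feet already sits near the acuity limit, which is why the 1080p-vs-4K gap is comparatively easy to see while the 1440p-vs-4K gap is much harder; sit closer or use a bigger screen and the numbers shift back in 4K's favour.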
 

PlaneInTheSky

Commendable
BANNED
Oct 3, 2022
This is actually demonstrably wrong. Here are 3 screenshots I took in Half Life 2 at different resolutions, all downsampled to 1080p for your convenience, so you can't argue with "sitting way too close to your screen".

What you're doing is downsampling. With downsampling you can overcome chroma subsampling for example, and turn a 4:2:2 image into a 4:4:4 image.

But that has nothing to do with being able to distinguish extra detail at higher resolutions. That extra detail per pixel only appears when you downsample.

There are several videos on YouTube where they test 1080p vs 4K screens at close viewing distances, and people had a really hard time telling the difference.

This idea that you can easily tell the difference is simply baloney imo. I think most people cannot tell the difference from normal viewing distances.

Between 1440p and 4K, there's no way people can spot the difference from regular distances; it is already incredibly hard to do at 1080p.

First people said they could see the difference between 4K and 1080p; now you have people claiming they can tell 8K from 4K; soon you'll have people claiming they can see 16K. Either some people have become mutants with incredibly hawk-like vision, or they're trying to justify their $10,000 8K TV. I'm going with the latter.

 

PlaneInTheSky

Commendable
BANNED
Oct 3, 2022
By the way, you would be right if you argued that shooting footage at a higher resolution and downsampling it improves image quality; several cameras can do this. If you shoot 4K and (correctly) downsample to 1080p, you can remove the chroma subsampling that is usually present in 1080p footage.

Some Sony cameras shoot their 4K footage at 6K and downsample it, and yes, it's better than shooting in native 4K.

Some games do this successfully too, to remove artefacts.

But this has nothing to do with the question of whether you can perceive the extra pixels/detail on a 4K screen or not. A 4K screen doesn't downsample to 1080p; 4K footage with chroma subsampling is still 4K footage with subsampling.

Downsampling also doesn't help all footage. It helps lower-quality images that have chroma or rendering issues. You're not going to improve the image quality from an Alexa cine camera much by downsampling it.

And lastly, you just need a 1080p screen to see the benefit of 4K footage downsampled to 1080p, not a 4K one.
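As an illustrative footnote to the subsampling point (assumed numbers, using the 4:2:2 case mentioned above): a 4K 4:2:2 frame carries chroma at half the horizontal resolution, and after a 2× downscale that still leaves at least one original chroma sample per output pixel, so the 1080p result ends up effectively 4:4:4:

Code:
# 4:2:2 halves chroma resolution horizontally only.
luma_4k       = (3840, 2160)
chroma_422_4k = (3840 // 2, 2160)   # chroma samples in a 4K 4:2:2 frame
target_1080p  = (1920, 1080)

# After a 2x downscale, every 1080p output pixel can draw on at least one
# original chroma sample in each axis, i.e. full (4:4:4) chroma at 1080p.
print(chroma_422_4k[0] >= target_1080p[0], chroma_422_4k[1] >= target_1080p[1])  # True True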
 

jkflipflop98

Distinguished
There is an obvious difference between 1440 and 4K. It's night and day. Anyone with a 4K monitor can go down to 1440 and see for themselves. It's like stepping back to 720p resolution. People spouting "there's no difference!" must be blind or have never actually seen a 4K display with their own eyes. More pixels = more data = more detail.
 

Exploding PSU

Honorable
Jul 17, 2018
I wonder if 13K resolution will ever become mainstream (heck, even 8K is already questionable). I still see 768p laptops being sold in the mainstream even today, and many people I know (who aren't really that tech-oriented, which is FAIR, I might add, and not something to be ashamed of) still run 1080p TVs / displays with no problem, probably for the foreseeable future. 1440p and 4K are still honestly pretty niche outside the "tech circles" and "enthusiasts". Even plenty of high-end phones have reverted to 1080p-ish resolutions these days.

Still a fun observation though.

I play Waifu Impact at 1440p with my Vega64 and it looks quite good.......

Waifus are guaranteed to look good, so don't worry.

At over "8K" resolution, you just get more Waifu details and that is always welcome, just like you get tiny Waifus in your phone.

I did try Eyefinity with it, but not quite "13K" resolution. I haven't tried VR yet for it, but I'm sure the Waifus will still look great.

Regards :LOL:

That game is the last one I'd expect to hear about on this site. That Vega 64 has seen much more exciting things than my Vega 56.
 

Neilbob

Distinguished
Mar 31, 2014
PlaneInTheSky said:
This idea that you can easily tell the difference is simply baloney imo. I think most people cannot tell the difference from normal viewing distances. ... Either some people have become mutants with incredibly hawk-like vision, or they're trying to justify their $10,000 8K TV. I'm going with the latter.

I've said this time and time again: people engage in enormous amounts of confirmation bias to justify what they say with regards to pixel density/speed/insert whatever.

Unless you happen to have a screen that's in excess of 50 inches, and/or you sit a bit too close to it, there will be no easily discernible difference after 4K / 244 Hz. The human eye hasn't evolved to the stage where it can tell the difference, and isn't going to any time soon (unless we count those mutants). This is a process of millennia, not something that happens in 20 years. It's why I shake my head in exasperation at stories of forthcoming nonsense like 480 Hz+ (aimed at people taken in by marketing, of course).

As for less conventional display methods, maybe there's a case there sometime in the future. I don't know what those methods might be, but I don't mean VR, which has too many restrictions (in my opinion) to ever be really mainstream.

That's my obligatory bit of griping out of the way for today.
 

jp7189

Distinguished
Feb 21, 2012
There is a difference between 1440p and 4K when you're doing "screen archery" and still-frame analysis of very small details, but once I start actually playing the game all that melts away (for me), and the frame rate matters more. I had this thought that at 4K I would be able to better see far-distant things (like enemies) and have an advantage that way, but no. Games cap render distance, and then it's all static 2D billboards beyond that point.
 

BX4096

Reputable
Aug 9, 2020
I don't find resolutions above 1440p useful. Nothing above that seems to actually add any detail unless you're sitting way too close to your screen.

If you don't find them "useful", you may want to upgrade your tiny monitor to something larger. I'm on a 50" display, and let me assure you, I have no problem seeing a difference between 4K and 1440p even with my poor eyesight. No one would.

There is an obvious difference between 1440 and 4K. It's night and day. Anyone with a 4K monitor can go down to 1440 and see for themselves. It's like stepping back to 720p resolution. People spouting "there's no difference!" must be blind or have never actually seen a 4K display with their own eyes. More pixels = more data = more detail.

To be fair, the difference is not really that pronounced if you're sitting in front of a 24" monitor like any of these. It gets significantly more noticeable at larger monitor sizes.
 

warezme

Distinguished
Dec 18, 2006
I don't find resolutions above 1440p useful. Nothing above that seems to actually add any detail unless you're sitting way too close to your screen.

If you need a magnifying glass to perceive 8k, you probably need an electron microscope to see 13k.
The 4090 wasn't really designed for 1440p resolution at all. You can tell by how little the gain is in most games at that resolution; by "little" I mean in comparison to the gain at 4K, which is huge. It's truly designed for 4K gaming at Ultra settings. Trust me, you can tell the difference between 1440p and 4K monitor resolution.
 

Deleted member 431422

Guest
Neilbob said:
I've said this time and time again: people engage in enormous amounts of confirmation bias to justify what they say with regards to pixel density/speed/insert whatever. ... Unless you happen to have a screen that's in excess of 50 inches, and/or you sit a bit too close to it, there will be no easily discernible difference after 4K / 244 Hz.
A voice of reason. I don't reply to these threads anymore. There's no point. It's like talking to an audiophile who hears static in the electric grid three stories below.
People believe what they want to believe, regardless of decades of research in the field.
 

edzieba

Distinguished
Jul 13, 2016
For reference, this is an upscaled image to 13K - since there are no 13K monitors on the market that we know of today.
I think you meant "downscaled from 13K" (to whatever display resolution was used).
Rendering at 13K and downscaling to [display resolution] is technically impressive. Rendering at [display resolution] and upscaling to 13K is a trivial and pointless endeavour.
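A toy example of why the direction matters (purely illustrative numbers, nothing from the article): a thin feature that only exists at the higher sampling rate survives a downscale as partial pixel coverage, while upscaling a render that never sampled it cannot invent it:

Code:
import numpy as np

hi = np.zeros((8, 8))              # stand-in for a supersampled render
hi[:, 3] = 1.0                     # a feature one sample wide (a thin cable, say)

# Downscale 2x by averaging 2x2 blocks (roughly what DSR-style downscaling does):
lo_from_hi = hi.reshape(4, 2, 4, 2).mean(axis=(1, 3))
print(lo_from_hi.max())            # 0.5 -> the feature survives as partial coverage

# A native low-res render whose sample grid missed the feature entirely:
lo_native = np.zeros((4, 4))
upscaled = np.repeat(np.repeat(lo_native, 2, axis=0), 2, axis=1)
print(upscaled.max())              # 0.0 -> upscaling cannot add what was never rendered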
 
Deleted member 431422 said:
A voice of reason. I don't reply to these threads anymore. There's no point. It's like talking to an audiophile who hears static in the electric grid three stories below.
People believe what they want to believe, regardless of decades of research in the field.
So you're saying you can't see the difference between 32" @ 1080p (native) and 32" @ 2160p (native) while sitting at a comfortable distance for gaming?
 
  • Like
Reactions: BX4096

BX4096

Reputable
Aug 9, 2020
So you're saying you can't see the difference between 32" @ 1080p (native) and 32" @ 2160p (native) while sitting at a comfortable distance for gaming?
If that's true, he may as well start learning Braille. No one can deny the difference in this particular case, except for people with truly bad vision or those who can never admit they are wrong.
 

Deleted member 1353997

Guest
But that has nothing to do with being able to distinguish extra detail at higher resolutions. That extra detail per pixel only appears when you downsample.
Your claim was, and I quote: "Nothing above [1440p] seems to actually add any detail unless you're sitting way too close to your screen"

If downsampling added visible detail, that detail would've still been visible on the source. It's not rocket science.