Linus Torvalds: 2560x1600 Needs To Be Next Standard


scooterboi

Distinguished
Nov 2, 2010
24
0
18,510
Apple released a very high-res display. Tom's comment section: Meh, pointless, who the F can read text that small, HAHAHA the beginning of Apple's downfall, enjoy 5 fps in anything, Appledrone, LOLOL.

Torvalds pushes for a very high-res display to be the new standard. Tom's comment section: FABULOUS, fantastic!!, oh god I've dreamed about this for sooo long, now that's what I'm talking about, totally agree with Linus, he's my prophet anyway.

Hypocrisy at its finest.
 

guvnaguy

Honorable
Oct 27, 2012
74
0
10,630
Graphics cards (especially in laptops) aren't ready for 1600p or greater yet, unless you get the most powerful/expensive solutions, so it would be kind of a step back for gaming IMO. Sure, text and images look great, but then you launch a game only to be forced to play it at 1080p or lower (which will look fuzzy after having just been at 2560x1600) because the video card can't handle it.

Also, movies aren't really ready yet. 4K isn't standard yet, nor is there a disc format dense enough to hold the many gigs of data. I've watched movies on the MacBook Retina and they don't look great. Upscaling doesn't work well when you have to make up two or three times as many pixels as were recorded.

TL;DR: Are we ready for higher res? Gaming hardware isn't. Movie/TV content isn't.
 

icehot

Distinguished
Feb 18, 2007
27
0
18,540
Hrm, not sure I agree with him. I have a 1080p laptop with a 17" screen, and I think that's a pretty decent resolution at that screen size; otherwise everything would be so small to look at and read that it would be a strain. Though I agree that on something like a 24" screen that could easily be a good resolution.
 
[citation][nom]alidan[/nom]100ppi is about the sweet spot for monitors, 150 would be pushing it for usefulness, but thats about were you wont see pixles anymore, unless you are trying to find them by getting closer. you able to give me a name to look for? i know i have seen higher resolution monitors for cheap, but that was a 1440p and i dont want a 16:9 i want a 16:10[/citation]
You can see substantially higher resolution than 100ppi - most people don't realize this however, since such monitors aren't commonly available. Even 150 isn't at the limit of the eye's visual acuity (even at the distance of a desktop monitor), but it's getting a lot closer. Realistically, the 300-400ppi seen in the iPad and many smartphones would be unnecessary, but certainly 200ppi would be nice. That means 3840x2400 (twice 1920x1200 in each dimension) in a 22 inch monitor, or 5120x3200 (twice 2560x1600 in each dimension) for a 30 incher.
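If you want to check the math, PPI is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch in Python (the panel sizes are the ones quoted above):

[code]
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch = diagonal pixel count / diagonal size in inches
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2400, 22)))  # ~206 ppi on a 22" panel
print(round(ppi(5120, 3200, 30)))  # ~201 ppi on a 30" panel
[/code]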

Admittedly, there are a couple of current issues with this; however, they really shouldn't be insurmountable. Here are the problems (and solutions) as I see them:

1) Scaling. This is really the big one, IMHO. The operating system needs to render everything in a pixel-independent manner, and then scale it based on the monitor's DPI. Not simply font scaling, but full UI scaling. There's no reason why text should be smaller at 5120x3200 than it is at 2560x1600 on the same size monitor - it should use the extra pixels to improve clarity and sharpness, not add more room on the screen. As an added bonus, with truly high pixel density and good OS scaling, you could adjust the size of UI elements to fit as much (or as little) on the screen as you wanted, depending on visual acuity and user preference.
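To illustrate what I mean by DPI-based scaling, here's a minimal sketch (the 96 dpi reference is just the usual baseline that existing UI sizes assume):

[code]
REFERENCE_DPI = 96  # baseline density that existing UI sizes are designed against

def to_device_pixels(logical_px, monitor_ppi):
    # Convert a size designed at the reference DPI into device pixels,
    # so the element keeps the same physical size on any monitor.
    return round(logical_px * monitor_ppi / REFERENCE_DPI)

print(to_device_pixels(16, 100))  # ~17 px on a ~100 ppi monitor
print(to_device_pixels(16, 200))  # ~33 px on a ~200 ppi monitor, same physical size, twice the detail
[/code]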

2) Graphics performance. I'll split this one in two:

2a) Desktop/productivity graphics performance. There's no reason at all why even the integrated graphics in a modern laptop couldn't push something like HD video or basic desktop applications, even at stupidly high resolutions. A modern Intel chip can play back a Blu-ray smoothly (high, high bitrate 1080p) with single-digit CPU usage, and basic upscaling is not terribly processor intensive. Advanced upscaling isn't as necessary when the pixel density is high either, since pixel-sized artifacts are less visible.

2b) Gaming. This is a problem admittedly, but all you would need to do is turn down the resolution. Also, anti-aliasing would be completely unnecessary. As pixels shrink down to the limits of visual acuity, the jagged edges caused by aliasing also shrink down to that limit, so AA becomes superfluous. In addition, extremely high resolution, high PPI monitors are better able to emulate non-native resolutions without the blurring problems that occur on low-PPI monitors when run at non-native resolutions, and they even effectively have multiple "native resolutions". For example, a 3840x2400 monitor also has a perfect image (effectively a second native resolution) at 1920x1200, since it's a perfect pixel doubling (which requires almost zero processing to achieve as well). Would graphics cards be able to run new games at full res? Likely not, unless you have an awesome >>$1k multi card top of the line setup, but you wouldn't need to run full res to get an equivalent gaming experience to what you already have (and for the games you could run at the higher resolution, that option would be available, which would further improve the gaming experience).
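The pixel-doubling point is easy to see in a toy example: every source pixel just becomes a 2x2 block of the same color, with no filtering or interpolation involved (rough sketch):

[code]
def double_pixels(frame):
    # Integer 2x upscale: each source pixel becomes a 2x2 block of the same value
    out = []
    for row in frame:
        doubled_row = [px for px in row for _ in (0, 1)]
        out.append(doubled_row)
        out.append(list(doubled_row))
    return out

frame = [["A", "B"],
         ["C", "D"]]
for row in double_pixels(frame):
    print(row)
# ['A', 'A', 'B', 'B']
# ['A', 'A', 'B', 'B']
# ['C', 'C', 'D', 'D']
# ['C', 'C', 'D', 'D']
[/code]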

3) Connections. No current connection could hope to supply the bandwidth for some of these ultra high res screens. However, I really doubt this is a problem - I think the only reason such a connection doesn't exist is because nothing right now would take advantage of it. If monitors started to become available with the extremely high resolutions mentioned here, a proper connection standard would be a pretty minor technical hurdle.
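To put a rough number on the bandwidth (uncompressed 24-bit color at 60 Hz, ignoring blanking and protocol overhead):

[code]
def uncompressed_gbps(width, height, refresh_hz=60, bits_per_pixel=24):
    # Raw video bandwidth in Gbit/s, ignoring blanking and protocol overhead
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(uncompressed_gbps(2560, 1600))  # ~5.9 Gbit/s
print(uncompressed_gbps(3840, 2400))  # ~13.3 Gbit/s
print(uncompressed_gbps(5120, 3200))  # ~23.6 Gbit/s, beyond a single DisplayPort 1.2 link (~17 Gbit/s usable)
[/code]

So 3840x2400 at 60 Hz is already within reach of current connections; 5120x3200 would need something faster, but that's exactly the kind of incremental standard bump that happens once there's hardware to drive.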


Oh, and finally, I'll throw my hat in with the vote to keep the 16:10 aspect ratio (though it may already be too late given the way the industry is heading). For the rare times you are editing movies, you can deal with the black bars, and if you're in a professional studio or something like that, you'll probably output to a TV rather than a monitor anyway, since that's the target medium (and you want to see how it will look to the majority of end users). For everything that isn't movie editing, the extra vertical space is really helpful. You can fit more text on the page. You can see more of the photo you are editing (since photos tend to be fairly close to a 4:3 aspect ratio). It even provides a larger viewable area for the same diagonal, so a 17 inch 16:9 screen actually has less screen area than a 17 inch 16:10, even though they are rated as the "same" size.
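The area point checks out with basic geometry; for the same 17-inch diagonal:

[code]
import math

def screen_area(diagonal_in, aspect_w, aspect_h):
    # Viewable area in square inches for a given diagonal and aspect ratio
    scale = diagonal_in / math.hypot(aspect_w, aspect_h)
    return (aspect_w * scale) * (aspect_h * scale)

print(round(screen_area(17, 16, 9), 1))   # ~123.5 sq in
print(round(screen_area(17, 16, 10), 1))  # ~129.9 sq in, about 5% more screen for the "same" size
[/code]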
 

Guest

Guest
Lower resolutions are great for gaming, though, for people with mid-range PCs that have to last a few years.
 

john_4

Honorable
Feb 27, 2012
367
0
10,780
[citation][nom]g00fysmiley[/nom]2560x1440 for an 11 inch screen seems... cool but kind of pointles si dont' think i could tell the difference between that and even 1080p or even 720 by much if anything... now in larger screens like 17" plus bring it on![/citation]
Obviously you have not looked at a retina display from Apple.
 

Pherule

Distinguished
Aug 26, 2010
591
0
19,010
[citation][nom]Article[/nom]There will also be some other tiers with tweeners and higher resolutions. Intel recently noted that it would expect 11-inch notebooks to see 2560x1440 resolutions, 13-inch models 2800x1800 and 15-inch and above 3840x2160.[/citation]
Oh please, NO!

Stick with one standard, be it 2560x1440 or 2560x1600, I don't care, just STICK WITH ONE!
 

serendipiti

Distinguished
Aug 9, 2010
152
0
18,680
With the screen being one of the most power-hungry devices in a notebook, once notebook makers reached a good-enough resolution, lowering power (= more battery life) mattered more to them than pushing higher res...
 

nebun

Distinguished
Oct 20, 2008
2,840
0
20,810
And we need this resolution because???... Just think of the power required to light up all those pixels, not to mention the power needed to run the hardware to fill them... you can forget about battery life.
 

zshazz

Distinguished
Oct 23, 2011
14
0
18,510
[citation][nom]nebun[/nom]just think of the power required to light up all those pixels[/citation]

Basically nothing. Backlights are what emit light in an LCD display, and the LCD pixels just filter that light. The backlight is what consumes nearly all of the power in such displays, so increasing the resolution would have an insignificant effect on power consumption.

OLED displays are different: they themselves emit light. However, there's no data to suggest that more pixels would increase power consumption significantly even for them. Four OLED pixels at a quarter of the size each shouldn't consume significantly more power than one OLED pixel producing the same white light.

As for "the power needed to run the hardware to fill those pixels":

Indeed, the hardware requirements to fill those pixels would be higher. But again, the power consumption would still be relatively insignificant when compared to the power consumption of a backlight.
 

zshazz

Distinguished
Oct 23, 2011
14
0
18,510
Well, interesting... the forum modified my post, removing the links to sources (and all of the text that was linked).

Basically nothing. and .

Should be (paraphrasing): "Basically nothing. Backlights are what emit light in an LCD display and LCD pixels filter this light." With links to Wikipedia: Backlights and Wikipedia: LCD displays.

: they themselves emit light.

Should be: "OLED displays are different: they themselves emit light." With link to Wikipedia: OLED.
 

f-14

Distinguished
I've been saying this for YEARS (not that 15" screens need it). How long have you been able to plug your graphics card into 40"+ TVs or into monitors larger than 24"?

How long have we had dual-monitor and triple-monitor setups?

There's a lot of slacking going on, probably at Sony's behest, if they wanted Blu-ray to work with their products.
Anybody who says they are happy with 1080p, I'd ground you to 640x480 for the next 10 years just so you have a clue what I'm talking about!
 

nameon

Distinguished
Feb 10, 2010
137
0
18,680
I'm gonna get scorched for this, but oh well...

Am I the only one who enjoys playing games on my old square Dell monitor at 720p more than on my Samsung 1080p monitor?

I'm making my GPU last longer #HD69502GB#
 

Jay-Z

Honorable
Sep 29, 2012
416
0
10,810
Higher resolutions should not only be mainstream for desktop monitors too, but also affordable. Sometimes I feel that monitors get left out in favour of mobile devices.
 

shadowfamicom

Honorable
May 2, 2012
136
0
10,680
[citation][nom]scooterboi[/nom]Apple released very high res display. Tom's comment section: Meh, Pointless, who the F can read those texts, HAHAHA the beginning of apple's downfall, enjoy 5 fps in anything appledrone LOLOL.Torvalds pushing for very hi-res display to be the new standard. Tom's comment section: FABULOUS, fantastic!!, oh god I've dreamed about this for sooo long, now that's what i'm talking about, totally agree with Linus he's my prophet anyway.Hypocrites at its finest.[/citation]

So true!
 

nameon

Distinguished
Feb 10, 2010
137
0
18,680
[citation][nom]Jay-Z[/nom]Not only should higher resolutions be mainstream for desktop monitors too, but also affordable. Sometimes I feel that monitors get left out in favour of mobile devices.[/citation]

Jay-Z?
 

Jay-Z

Honorable
Sep 29, 2012
416
0
10,810


Haha, that's my nickname. I've always been called that due to my initials. However, I love (and perform) classical music. Apparently there is a rapper of the same name. :lol:
 

guardianangel42

Distinguished
Jan 18, 2010
554
0
18,990
I'm actually perfectly fine with 1366x768 resolution displays in laptops. It means that the limited muscle of a dedicated GPU can go further. I can have advanced lighting, textures, water, particle, and whatever else effects in games and only suffer from a lower native resolution (meaning larger HUD elements and text).

Frankly, on a 15 inch laptop, from this frugal gamer's perspective, lower native resolutions are a good thing. It means I need to spend less to get a mostly comparable experience to a desktop rig.
 

alidan

Splendid
Aug 5, 2009
5,303
0
25,780
[citation][nom]cjl[/nom]You can see substantially higher resolution than 100ppi - most people don't realize this however, since such monitors aren't commonly available. Even 150 isn't at the limit of the eye's visual acuity (even at the distance of a desktop monitor), but it's getting a lot closer. Realistically, the 300-400ppi seen in the iPad and many smartphones would be unnecessary, but certainly 200ppi would be nice. That means 3840*2400 (twice 1920x1200 in each dimension) in a 22 inch monitor, or 5120x3200 (twice 2560x1600 in each dimension) for a 30 incher. Admittedly, there are a couple of current issues with this, however, they really shouldn't be insurmountable. Here's the problems (and solutions) as I see them:1) Scaling. This is really the big one, IMHO. The operating system needs to render everything in a pixel-independent manner, and then scale it based on the monitor's DPI. Not simply font scaling, but full UI scaling. There's no reason why text should be smaller at 5120x3200 than it is at 2560x1600 on the same size monitor - it should use the extra pixels to improve clarity and sharpness, not add more room on the screen. As an added bonus, with truly high pixel density and good OS scaling, you could adjust the size of UI elements to fit as much (or as little) on the screen as you wanted, depending on visual acuity and user preference.2) Graphics performance. I'll split this one in two:2a) Desktop/productivity graphics performance. There's no reason at all why even the integrated graphics in a modern laptop couldn't push something like HD video or basic desktop applications, even at stupidly high resolutions. A modern intel chip can play back a blu ray smoothly (high, high bitrate 1080p) with single digit CPU usage, and basic upscaling is not terribly processor intensive. Advanced upscaling isn't as necessary when the pixel density is high either, since pixel-sized artifacts are less visible.2b) Gaming. This is a problem admittedly, but all you would need to do is turn down the resolution. Also, anti-aliasing would be completely unnecessary. As pixels shrink down to the limits of visual acuity, the jagged edges caused by aliasing also shrink down to that limit, so AA becomes superfluous. In addition, extremely high resolution, high PPI monitors are better able to emulate non-native resolutions without the blurring problems that occur on low-PPI monitors when run at non-native resolutions, and they even effectively have multiple "native resolutions". For example, a 3840x2400 monitor also has a perfect image (effectively a second native resolution) at 1920x1200, since it's a perfect pixel doubling (which requires almost zero processing to achieve as well). Would graphics cards be able to run new games at full res? Likely not, unless you have an awesome >>$1k multi card top of the line setup, but you wouldn't need to run full res to get an equivalent gaming experience to what you already have (and for the games you could run at the higher resolution, that option would be available, which would further improve the gaming experience). 3) Connections. No current connection could hope to supply the bandwidth for some of these ultra high res screens. However, I really doubt this is a problem - I think the only reason such a connection doesn't exist is because nothing right now would take advantage of it. 
If monitors started to become available with the extremely high resolutions mentioned here, a proper connection standard would be a pretty minor technical hurdle.Oh, and finally, I'll throw my hat in with the vote to keep 16x10 aspect ratio (though it may already be too late given the way the industry is heading). For the rare times you are editing movies, you can deal with the black bars, and if you're in a professional studio or something like that, you probably will output to a TV rather than a monitor anyways, since that's the target media (and you want to see how it will look to the majority of end users). For everything that isn't movie editing, the extra vertical space is really helpful. You can fit more text on the page. You can see more of the photo you are editing (since photos tend to be fairly close to 4:3 aspect ratio). It even provides a larger viewable area for the same diagonal, so a 17 inch 16:9 screen actually has less screen area than a 17 inch 16:10, even though they are rated to be the "same" size.[/citation]

I know we can technically see more; books are printed at 600dpi because of how close we read them. Now, this may be a subjective thing, but I got a tape measure and did this just to see where I no longer notice pixels.
I'm looking at the Start menu for this, with a white background, on a 95dpi screen:
12 inches away, I notice pixels.
20 inches, I think I only notice pixels because of the high contrast.
32 inches, my normal viewing distance while typing, I don't know if I can see them, or if I just think I see them because I know they are there.
51 inches, my normal relaxed reading/watching-crap distance, I can't see pixels even though I know they are there.

When I think of when I see aliasing in games (I play at 1920x1200 full screen and 1920x1080 windowed), it's always in high, HIGH contrast areas of games; I never see it otherwise.

For me, normal typing distance is, as I said, with my head 32 inches from the screen, and I am unsure if I see pixels or if it's just because I know they are there. With an extra 50 on the dpi, I don't think I would even be wondering whether I see them or not.
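For what it's worth, those distances line up with the usual ~1 arcminute figure quoted for 20/20 visual acuity. A quick check, assuming roughly the 95dpi panel described above:

[code]
import math

def pixel_arcminutes(ppi, distance_in):
    # Angular size of one pixel, in arcminutes, at a given viewing distance
    pixel_pitch_in = 1.0 / ppi
    return math.degrees(math.atan2(pixel_pitch_in, distance_in)) * 60

print(round(pixel_arcminutes(95, 12), 1))  # ~3.0 arcmin: pixels clearly visible
print(round(pixel_arcminutes(95, 32), 1))  # ~1.1 arcmin: right at the ~1 arcmin acuity limit
print(round(pixel_arcminutes(95, 51), 1))  # ~0.7 arcmin: below the limit, pixels effectively invisible
[/code]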
 

Guest

Guest
Ugh... just imagine the graphics hardware you'd need to play games (max settings + max AA and all the necessary post-processing filters) at that resolution. I have a GTX 570 in my main desktop and even that has issues with some of the latest games with all the goodies maxed out.

High res is wonderful, but even a small bump in resolution can mean a big drop in performance. On my laptop, dropping the resolution from 1366x768 to 1280x720 gives me an additional 10-15 fps.
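That lines up with the pixel counts: when a game is fill-rate bound, performance scales roughly with the number of pixels pushed. A quick comparison (whether the difference is 10-15 fps obviously depends on the base frame rate):

[code]
resolutions = {"1280x720": 1280 * 720, "1366x768": 1366 * 768,
               "1920x1080": 1920 * 1080, "2560x1600": 2560 * 1600}

base = resolutions["1366x768"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / base:.2f}x the pixels of 1366x768")
# 1280x720:  0.88x -> ~12% fewer pixels to render
# 1920x1080: 1.98x
# 2560x1600: 3.90x -> why full-res gaming on these panels would hurt
[/code]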
 

scannall

Distinguished
Jan 28, 2012
354
0
18,810
[citation][nom]imcommentinghere[/nom]Ugh...just imagine the graphics hardware you'd need to play games (max settings+max AA and all necessary post processing filters) at that resolution. I have a GTX570 in my main desktop computer and even that has issues with some of the latest games with all the goodies maxed out. High res is wonderful, but just even a small bump in resolution can mean a big drop in performance. On my laptop, dropping the resolution from 1366x768 to 1280x720 gives me an additional 10-15fps.[/citation]


With resolution that high you no longer need AA, and that's one of the bigger hogs.
 