Discussion: Will we see 8K resolution in the new iMac?


MaTech

Hello, good afternoon.

You probably know that at CES 2015, Sharp and other brands introduced the world's highest display resolution: 8K.

Well, LG announced that an 8K iMac will arrive this year. You can see it here: http://9to5mac.com/2015/04/06/lg-publicly-claims-apple-will-release-imac-8k-with-8k-screen-this-year/.

This seems a little stupid, because right now there are no components that can drive that resolution smoothly (although Windows 10 will support it). That's why I find the claim a little strange.

I'd like to know if this is true, because if I'm right, LG posted it around April Fools' Day. Also, when you try to open the official link to that post (to verify the information), you get a YOU DON'T HAVE PERMISSION error. April Fools?

Do you think that Apple is going to launch an 8K iMac?
I'd also like to know why we can't see the official LG page where this is posted (http://lgdnewsroom.com/). The page seems to be "invisible".

April Fools?


[Image: lg-8k-imac-2015-04-07-03.jpg]

This is a screenshot of the official post by LG.

A few hours ago I was trying to work out why we can't see the content of that page, and the browser says: "The browser can connect to the page, but the site refused to show the information; you must be logged in to see it." If you try to connect, you'll see that the page is completely white (you don't have permission to view it).

Open Google Translate and try to translate that page into another language. Your browser will show you the message I wrote before: "You must be logged in to view the content."

What's happening? Did LG make a mistake?


I hope you can help me.
Thanks.





 
I'm not paying attention to 4K at this point, since 144 Hz at 4K is not possible: no connection (cable) supports it. I expect we will see GFX cards capable of supporting 4K at 144 Hz and delivering 60+ fps by Xmas 2016. As for 8K... maybe 2020.
 

MaTech

"I'm not paying attention to 4k at this point since 144 Hz at 4k is not possible since no connection (cable) supports it. I expect we will see GFX cards capable of supporting 4k at 144 Hz and delivering 60+ fps by by XMas 2016. As for 8k.... maybe 2020. "

Well, you're right, but as you can see, LG announced that Apple is putting the first 8K display in the new iMac, so we wouldn't have to wait until 2020. I'm sure that by 2020 and beyond, cartoons and everything else will be at 4K resolution or higher.
Gaming on an 8K iMac at 20 fps is going to be impossible unless they use amazing technology, in which case the iMac would cost $9,999.

"Most people I know have eyesight too poor to see the difference between HD and 4K ".

I'm sorry, but I don't agree. There is a big difference in detail between HD and 4K; the people you know probably have a poor 4K monitor. I've been using a 4K monitor for a few months now, and the difference in detail is substantial.

Also, we are talking about the 8K iMac that LG has just announced!

I just want to know whether it's true that they are making an 8K iMac.
I know 8K will mean a lot of problems for the GPU, etc.

Thanks for the answers.

Greetings.
 
Well, I agree that come 2020, 4K may be the 1080p of today. But it's not 2020 yet. Someday they will make higher resolutions, but they will be meaningless until desktop technology exists to support them.

Two Titan Xs in SLI still can't drive 4K at 60+ fps across today's must-have games, and no technology exists to support 4K @ 144 Hz. It's fun to speculate, but the point I am making is that until we have DP 1.3, and GFX technology to fulfill its promise, I'm not going to be interested in 4K, let alone 8K. Today I'd take the 1440p IPS Predator w/ G-Sync and ULMB over any 4K monitor on the market without a thought.
 

20/20 vision is the ability to resolve a line pair with one arc-minute (1/60th degree) separation, or 120 pixels per degree. For a monitor viewed from 3 feet away, that's about 191 ppi. A 24" 1080p monitor has just 92 ppi, while a 24" 4k monitor has 184 ppi. So yeah, the difference between a 1080p and 4k monitor is easily discernible in regular computer use.

What he said is true for TVs however. For a typical TV viewing distance of 10 feet, the resolving limit is 57 ppi. A 50" 1080p TV is 44 ppi. A 4k TV is 88 ppi. So most of the extra resolution of a 4k TV is not discernible at this distance. Add to that, most people don't have 20/20 vision and aren't exactly picky about screen details, so they're not going to be able to tell the difference.

Also note that a 24" 8K monitor (7680x4320) would have 367 ppi, which would be pointless at 3 feet. You'd have to lean in so your eyes were 19" away from the screen, or the monitor would need to be 46" @ 3 feet, for all that extra resolution to be useful.
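If anyone wants to check those numbers themselves, here's a rough Python sketch of the same arithmetic. The 120 pixels-per-degree figure for 20/20 vision is the one used above; it's a simplification, not an exact model of the eye.

import math

def resolvable_ppi(viewing_distance_in):
    """Highest PPI a 20/20 eye can resolve at a given viewing distance (inches),
    assuming ~120 pixels per degree as in the post above."""
    pixel_pitch = viewing_distance_in * math.tan(math.radians(1 / 120))  # inches per pixel
    return 1 / pixel_pitch

def monitor_ppi(diagonal_in, horiz_px, vert_px):
    """PPI of a panel from its diagonal size and pixel counts."""
    return math.hypot(horiz_px, vert_px) / diagonal_in

print(round(resolvable_ppi(36)))            # ~191 ppi at 3 feet
print(round(resolvable_ppi(120)))           # ~57 ppi at 10 feet
print(round(monitor_ppi(24, 1920, 1080)))   # ~92 ppi  (24" 1080p)
print(round(monitor_ppi(24, 3840, 2160)))   # ~184 ppi (24" 4K)
print(round(monitor_ppi(24, 7680, 4320)))   # ~367 ppi (24" 8K)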
 
3 feet away? I couldn't reach the KB, let alone read 6-point text :) Windows was based upon 96 dpi...

1) The logic being 1/3 more than the 72 dpi of print, just as normal viewing distances for monitors are 1/3 more than for paper. Normal reading distance for a book is about the distance from your elbows to your fingers (15 - 21" for most). For monitors, it's 20 - 28".

2) With 20/20 vision, most people can start to discern individual pixels at around 96 dpi. Get much below that and the image looks grainy, which is one reason why 1080p 27" monitors are not recommended.

But there's more than ppi involved here. The extra pixels allow the creation of more accurate shading and color as well as less ghosting. Much like a photo printer with 8 colors can create better pics than a normal printer with just 3 colors. And with higher frame rates / refresh rates, ghosting is reduced because, to move an inch across the screen horizontally or vertically, you have twice as many pixels to absorb that motion.
 

MaTech

Thank you for the comments, but do you know whether it's true that they are creating a new 8K iMac?

I don't like iMacs at all, but if the 8K iMac is launched, Dell will follow [like what happened with the 5K iMac].

I think one of three things can happen:

1. Apple already knows that today's technology is not enough to run 8K smoothly, and they are waiting.
2. Apple makes its iMac with the technology we have today.
3. Apple uses amazing technology, but then the iMac would cost $9,999.

Well, but I need to know whether this is true, and WHY, when you try to open the official page for that post (lgdnewsroom.com), it says: "The browser can connect to the site, but it refused to display the information; you must be logged in."

WHY?
Did LG make a mistake?


Greetings.
 

Please try for a moment to imagine any scenario where the distance from your elbows to your fingers matches the distance from your eyes to a book you're holding. Assuming your fingers are holding the book, try to touch your eyes with your elbows.

OSHA recommends a viewing distance of 20"-40". As you get older, you'll find yourself at the upper end of that range.
https://www.osha.gov/SLTC/etools/computerworkstations/components_monitors.html

But there's more than ppi involved here. The extra pixels allow the creation of more accurate shading and color as well as less ghosting. Much like a photo printer with 8 colors can create better pics than a normal printer with just 3 colors.
Photo printers use more than 3 colors because inks are subtractive, whereas light is additive. That's why the monitor industry uses RGB while the printing industry uses CMYK. Photo printers are kind of stuck in the middle because people want to preview their photos on RGB monitors, but the final output is subtractive like CMYK. They have to use all sorts of tricks to bridge the difference to make the two images match, including using extra inks.

It has nothing to do with pixel resolution or higher PPI screens. There are a lot of misconceptions about high-PPI displays out there because of Apple's misleading advertising about their Retina displays (parroted by a legion of faithful pro-Mac review sites who don't really understand the technology). Truthfully, Apple needed the higher PPI displays because OS X doesn't do subpixel rendering. Hook a monitor up to a Mac and draw a diagonal line, and the smoothest it can do is:

RGB rgb rgb
rgb RGB rgb
rgb rgb RGB

On Windows the same monitor can draw:

RGB rgb rgb
rGB Rgb rgb
rgB RGb rgb
rgb RGB rgb
rgb rGB Rgb
rgb rgB RGb
rgb rgb RGB

Effectively tripling your horizontal resolution resulting in much smoother appearance of things like fonts. Hook a Mac and Windows PC up to the same monitor and you can see this for yourself - the PC's fonts look substantially smoother with ClearType turned on (ClearType = subpixel rendering). Or put another way, you will not see as big an improvement in font clarity with a 4k monitor hooked up to a PC as you would with it hooked up to a Mac, because the Mac is starting off with a much worse image.
https://www.grc.com/ct/ctwhat.htm
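If you want to see the idea without hooking up two machines, here's a toy Python sketch (my own illustration, not ClearType's actual algorithm) that prints the two diagonal-line patterns above. Uppercase subpixels are "lit"; treating R, G and B as individually addressable gives three times as many horizontal positions as whole-pixel rendering.

def scanline(shift, pixels=3):
    """One row: a 3-subpixel-wide lit run starting at subpixel index `shift`."""
    subs = "rgb" * pixels
    lit = set(range(shift, shift + 3))
    marked = "".join(c.upper() if i in lit else c for i, c in enumerate(subs))
    # group back into whole pixels for readability
    return " ".join(marked[i:i + 3] for i in range(0, len(marked), 3))

print("Whole-pixel steps:")
for y in range(3):
    print(scanline(3 * y))   # the run can only start on a pixel boundary

print("\nSubpixel steps:")
for y in range(7):
    print(scanline(y))       # the run can start on any subpixel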

OS X doesn't do subpixel rendering because it slightly messes up the spacing between virtual pixels, due to the physical spacing between the pixels and subpixels being inconsistent across different monitors (why you have to run the ClearType tuning tool to optimize the display). That tiny spacing is very important for things like accurate font kerning. Due to the high usage of Macs among people in the publishing industry, Apple (correctly IMHO) made the decision not to do subpixel rendering, thus presenting a blurrier image but with more accurate spacing. Without subpixel rendering, the only way to improve the appearance of fonts was to move to a higher resolution display. Which is what they did with the Retina screens.

Tablets and phones do not do subpixel rendering either, because the subpixel layout is different on RGB screens depending on whether you view in landscape or portrait mode. That's one of the rationales for Pentile displays - the RGBG layout of Pentile is symmetric in both landscape and portrait orientations, allowing you to do subpixel rendering on these devices.
http://pentileblog.com/wp-content/uploads/2013/04/Diamond-PenTile-Layout.jpg

In other words, most of the stuff you hear about Retina and high-PPI displays is just hype. They are not magical; they were made to address shortcomings in the display technology used by the operating system that were not present in the typical Windows box. That's not to say there isn't a benefit to having a 4k monitor. For monitors in particular, RGB subpixel rendering only works in the horizontal axis. The vertical resolution is still the same 1080p. Just that a lot of the benefit you hear about them is exaggerated marketing tripe.

And with higher frame rates / refresh rates, ghosting is reduced because, to move an inch across the screen horizontally or vertically, you have twice as many pixels to absorb that motion.
Ghosting happens when pixels physically cannot change their color to a new value quickly enough. This is independent of resolution. If a white bar needs to move 1 inch on your screen from one frame to the next, on the 92 PPI monitor this means 92 pixels need to change from white to black in one frame. On the 184 PPI monitor it means 184 pixels need to change from white to black in one frame. Both monitors will take the same amount of time to flip those pixels, and suffer the same amount of ghosting, all other things (like refresh rate) being equal.

If there is a reduction in ghosting, it would be caused by the pixels physically being smaller (less liquid crystal material to change polarity). Not by the increased resolution.


We're more than happy to speculate about future products. But if any of us actually knew what Apple, LG, or Dell planned, we would probably be under NDA and be unable to discuss it.
 


Umm... any scenario in 2- or 3-dimensional space?

[Image: abc_triangle.png - a triangle with vertices a, b, and c]


1. a is the same distance from b as b is from c, no?

2. Is a touching b or c?

Or how about: Tom lives 1.4 miles from Harry; Tom lives 1.4 miles from Dick. By this logic, Dick must live with Harry?

Or just how about: I am lying on the grass under a tree, my head is on the ground at b, my elbows are at c, and the book is at a.

OSHA recommends a viewing distance of 20"-40". As you get older, you'll find yourself at the upper end of that range.

I'm 60. My Dad's 84 and MIL is 83. My kids are 19 - 27 ... everyone has their chair in pretty much the same place. In the office, same thing. The standard return on a desk / workstation is 16" deep, and the size of your typical 24" screen's base results in the monitor sitting 8 - 9" from the back edge of the desk / wall. That puts the screen about 8" back from the front edge of the desk. At 40", your body would have to be 32" from the desk. I have never seen that in actual use, and I'm in a lot of offices.

On the desktop, I am at 23-24"

On my lappie, I'm at 19-21" ... question: the top row of keys is 2" from the screen; at 40", how do I reach the KB? My arms from pits to base of fingers are only 24", and I don't type with my arms straight out. Comfortably, my fingertips are about 12" - 18" away from my body. The screen is 2" (bottom) - 3" (top) away from the last row of the keyboard.

It has nothing to do with pixel resolution or higher PPI screens. There are a lot of misconceptions about high-PPI displays out there because of Apple's misleading advertising about their Retina displays (parroted by a legion of faithful pro-Mac review sites who don't really understand the technology). Truthfully, Apple needed the higher PPI displays because OS X doesn't do subpixel rendering. Hook a monitor up to a Mac and draw a diagonal line, and the smoothest it can do is:

Two issues:

1. What about yellow pixels ?

QuadPixel technology adds a yellow pixel, which Sharp says raises the number of reproducible colors into the trillions.

"A yellow sub-pixel enables more light to pass through the system, which requires less intensive backlighting," Sharp explained in a statement to TechNewsDaily. "The obvious advantage here is a more environmentally friendly overall TV system thanks to lower power consumption. One of Sharp’s core principle is prioritizing technologies that are less power consuming and therefore have a smaller overall carbon footprint."

2. When we look at 1080p, we can see a pixel at 96 dpi, so you have that RGB thing going on. Now, in the same space, you have

RGB RGB
RGB RGB

The combination of colors you can get out of 4 x RGB is way more than you can get out of the single RGB cluster.

And with higher frame rates / refresh rates, ghosting is reduced because, to move an inch across the screen horizontally or vertically, you have twice as many pixels to absorb that motion.

Ghosting happens when pixels physically cannot change their color to a new value quickly enough. This is independent of resolution. If a white bar needs to move 1 inch on your screen from one frame to the next, on the 92 PPI monitor this means 92 pixels need to change from white to black in one frame. On the 184 PPI monitor it means 184 pixels need to change from white to black in one frame. Both monitors will take the same amount of time to flip those pixels, and suffer the same amount of ghosting, all other things (like refresh rate) being equal.

If there is a reduction in ghosting, it would be caused by the pixels physically being smaller (less liquid crystal material to change polarity). Not by the increased resolution.

It's not true ghosting in the technical sense, as with screen measurements for response time, but when you walk up close to a large-screen 1080p TV, you see a lot of blurring, as if you were seeing double, because the pixel matrix has difficulty accurately displaying the large image due to the digital representation. Just as digitizing presents a digital approximation of the analog sound wave, it is a series of dots rather than a smooth curve.

We have all played with a "connect the dots" coloring book when we were kids. If I use graph paper to plot a picture of Mickey Mouse on 20-squares-per-inch graph paper, the image is a lot clearer if I do it at 40 squares per inch. Do a flip book with a series of images, and the 40 will look a lot clearer and less jumpy than the 20. When you see an elbow at a person's side and then it's up near their head, the brain still retains the ghost of the previous image in your head. Increased detail and increased frame rates both help to minimize this.
 

That's only true when abc is an equilateral triangle (that is, angle a = angle b = angle c). In the vast majority of cases this is not true; it is not an equilateral triangle, and distance ab can vary anywhere from zero to twice ac (assuming ac = bc).

In other words, Dick can live anywhere along a circle of 1.4 mile radius centered on Tom. That circle means Dick can be anywhere from 0 to 2.8 miles from Harry.
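To put numbers on it, here's a quick Python sketch (my own, using the hypothetical Tom/Dick/Harry distances) running the law of cosines over a few angles:

import math

a = b = 1.4  # miles from Tom to Harry and from Tom to Dick
for theta_deg in (0, 30, 60, 90, 120, 150, 180):
    theta = math.radians(theta_deg)
    # law of cosines: distance between Dick and Harry given the angle at Tom
    d = math.sqrt(a**2 + b**2 - 2 * a * b * math.cos(theta))
    print(f"{theta_deg:3d} deg -> Dick is {d:.2f} miles from Harry")
# 0 deg   -> 0.00 miles (they'd be neighbors)
# 60 deg  -> 1.40 miles (the equilateral case)
# 180 deg -> 2.80 miles (opposite sides of Tom)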

OSHA recommends a viewing distance of 20"-40". As you get older, you'll find yourself at the upper end of that range.

I'm 60. My Dad's 84 and MIL is 83. My kids are 19 - 27 ... everyone has their chair in pretty much the same place. In the office, same thing. The standard return on a desk / workstation is 16" deep, and the size of your typical 24" screen's base results in the monitor sitting 8 - 9" from the back edge of the desk / wall. That puts the screen about 8" back from the front edge of the desk. At 40", your body would have to be 32" from the desk. I have never seen that in actual use, and I'm in a lot of offices.
Most desks I've seen are 24"-30" deep. The only 16" desks I've seen were the little ones built into university chairs.
http://www.fas.harvard.edu/~loebinfo/loebinfo/Proportions/furniture.html

Assuming the desk is against a wall, most people put the monitor pretty close to the wall. So figure about 20" from the monitor to the front edge of the desk is more realistic. So if you sat perfectly upright as close as you can to the desk, that would be 20" from eyes to monitor. Most people sit with about 6-12" of separation from the chest to the front of the desk. And they sit leaning back slightly in their chair (I know some people lean forward - you're not supposed to, it's bad for your back). So I'd say about 2-3 feet is typical. The slide-out keyboard trays found on some desks increases the distance even more.

Anyhow, the exact distance is personal preference. The numbers I gave were illustrative. The fact that you can tell the difference between 4k and 1080p at 36" means you can definitely tell the difference at 20" if that's your preferred viewing distance.

1. What about yellow pixels ?
Yellow isn't a primary color, so it can be generated on a monitor using a mix of red and green. The complication with yellow is that the cones in our eyes (the parts that see color) do not see exactly red, green, and blue. Each responds to a variety of different wavelengths, and it's actually our brains which combine them to generate a color. If you look at the spectral response of the cones:
http://web.atmos.ucla.edu/~fovell/AS3/theory_of_color.html

You'll see that the "red" cones actually respond most strongly to orange-yellow light. What you perceive as "red" is actually a combination of a strong red cone response, a weak green cone response, and zero blue cone response. Your brain puts that together and decides, "Oh, that must be red."

An RGB color space carves out a triangle of colors it can generate in a CIE chart. However, due to the nonlinear response of the eye's cones, the colors you can actually see do not fall entirely within that triangle. It bulges out slightly on the blue-green and green-red edges. The blue-green edge (cyan) is not as important because your eyes suck at seeing blue. But the bulge at the green-red edge means that you can actually see a bit more yellow than can be generated with an RGB triad.
https://en.wikipedia.org/wiki/Adobe_RGB_color_space#/media/File:CIExy1931_AdobeRGB.png

Several manufacturers have tried to account for this by adding a yellow subpixel. But the effect is subtle enough that most people can't really tell the difference, and it's never taken off in the market. A similar problem happens with purple. If you look back at the cone spectral response graph, you'll notice your "red" cones actually respond to purple light. Consequently, when you see a response by your blue and red cones, but not your green cones, your brain interprets this as purple. But real-life things that stimulate your red cones usually also stimulate your green. And the combined red, green, blue stimulation usually results in a muddy brown. RGB doesn't have this limitation - the red pixels can stimulate your red cones without stimulating your green cones at all. Consequently, the purples you see on a screen can appear stronger and more vivid compared to real life (color film had a similar but opposite problem, and purples never looked quite right).

2. When we look at 1080p, we can see a pixel at 96 dpi, so you have that RGB thing going on. Now, in the same space, you have

RGB RGB
RGB RGB

The combination of colors you can get out of 4 x RGB is way more than you can get out of the single RGB cluster.
Assuming we're talking about a decent LCD with 8-bit or 10-bit color depth, the number of shades that can be generated by a single RGB triad already exceeds the number of shades your eyes can detect. Adding more shades by blending four RGB pixels doesn't magically increase the number of shades your eyes can see.
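For reference, the arithmetic behind that claim; the ~10 million figure below is a commonly cited estimate for how many colors humans can distinguish, not something measured here:

for bits in (6, 8, 10):
    levels = 2 ** bits                      # shades per channel
    print(f"{bits}-bit per channel: {levels}^3 = {levels ** 3:,} colors")
# 6-bit:  262,144
# 8-bit:  16,777,216  (already beyond the ~10 million the eye can distinguish)
# 10-bit: 1,073,741,824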

It's not true ghosting in the technical sense, as with screen measurements for response time, but when you walk up close to a large-screen 1080p TV, you see a lot of blurring, as if you were seeing double, because the pixel matrix has difficulty accurately displaying the large image due to the digital representation. Just as digitizing presents a digital approximation of the analog sound wave, it is a series of dots rather than a smooth curve.
I'm not sure what you mean by the first part. But your last sentence is a common misconception about digital sampling. The digital "approximation" of an analog sound wave is in fact just as smooth as the analog original. It's just drawn as a stairstep to make it obvious what the digital value is. The waveform is not actually a stairstep.

https://www.youtube.com/watch?v=cIQ9IXSUzuM

I really suggest watching the video. It clears up a lot of misconceptions people have about digital signals.
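If you'd rather see it in code than in the video, here's a tiny Python/NumPy sketch (my own, not from the video) of the same point: a tone sampled above the Nyquist rate can be reconstructed between the samples, so the stairstep picture is only a drawing convention.

import numpy as np

fs = 44100.0          # sample rate (Hz)
f = 1000.0            # a 1 kHz tone, well below Nyquist (22.05 kHz)
n = np.arange(2048)
samples = np.sin(2 * np.pi * f * n / fs)   # the stored "digital" values

def reconstruct(t_sec):
    """Whittaker-Shannon (sinc) interpolation from the stored samples."""
    return np.sum(samples * np.sinc(fs * t_sec - n))

# Evaluate halfway between two samples, far from the edges of the buffer.
t = 1000.5 / fs
print(abs(reconstruct(t) - np.sin(2 * np.pi * f * t)))
# prints a very small number (~1e-4 here); the residual comes from truncating
# the infinite sum to 2048 samples, not from the sampling itself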
 


No, it's not only true with an equilateral triangle ... let's go back

Please try for a moment to imagine any scenario where the distance from your elbows to your fingers matches the distance from your eyes to a book you're holding. Assuming your fingers are holding the book, try to touch your eyes with your elbows.

Normal reading distance for a book is about the distance from your elbows to your fingers (15 - 21" for most).

You did say any scenario, right? So an equilateral triangle with 15" sides is one scenario, so we disproved the premise. But I can also imagine two sides being 14.95 and 15.0 ... and 15.0 and 15.05, and any of the multitude of combinations with one number between 15 - 18" and the other between 15 - 18".

Here again, this proved the point. If Dick can live anywhere along that circle, then it is in no way required that they be touching, as you required in your elbows and eyes example. It's not possible to have a triangle when any two of those points are touching each other ... that's not a triangle, it's a line.

Assuming the desk is against a wall, most people put the monitor pretty close to the wall. So figure about 20" from the monitor to the front edge of the desk is more realistic. So if you sat perfectly upright as close as you can to the desk, that would be 20" from eyes to monitor. Most people sit with about 6-12" of separation from the chest to the front of the desk. And they sit leaning back slightly in their chair (I know some people lean forward - you're not supposed to, it's bad for your back). So I'd say about 2-3 feet is typical. The slide-out keyboard trays found on some desks increases the distance even more.

We have to address office and home scenarios differently. In an office, putting the screen 20" from the front of the desk leaves 9 - 10" of desk inaccessible ... no one does that; desk real estate is at too high a premium. That's why, in an office setting with 29-30" desks, computers are rarely put "on the desk"; they are put on the desk return. Desks will have a depth of 24 - 30", and the return where the keyboard drawer is mounted a TOTAL of 16 - 24" deep. At 20" from the front of the desk, that puts the screen anywhere from 4" inside the back edge of the return to 4" beyond it. I just measured the distance from the back edge of the stand on my VG248QE to the front of the screen, and it's 7".

20" + 7" puts my monitor stand 3" into the wall on a 24" return
20" + 7" puts my monitor stand 11" into the wall on a 16" return ... in fact the screen is 4" into the wall.

The return on my office manager's desk is 18" deep, but it has a 1" back which sticks up to stop stuff from getting pushed off the back of the desk ... so she has 17" (the screen would be 3" into the wall, the stand 10" into the wall). The next desk has a 20" return (stand 7" into the wall). Mine is 18" deep but has rounded edges, so 16" usable ... my stand would be 9" into the wall ... the 3 kids' desks are 17" deep, but they get a bit deeper above the KB drawer ... I just measured theirs: one screen is 12" from the front of the desk, the other 2 are 11" from the front of the desk.

Bedroom desks are kind of tight as big desks don't fit in 12 x 14 bedrooms ... and will run 20 - 24"

http://www.amazon.com/South-Shore-Axess-Collection-Black/dp/B003FGWY1O/ref=sr_1_9?ie=UTF8&qid=1437434713&sr=8-9&keywords=Student+Bedroom+Desks
http://www.amazon.com/Harbor-View-Computer-Desk-Hutch/dp/B001DNF26U/ref=pd_sim_sbs_196_2?ie=UTF8&refRID=1TSGMVC9JYWACM5BXD6G

So I think most often, it's less personal preference than space availability

Yellow isn't a primary color, so it can be generated on a monitor using a mix of red and green. You can also do it on an RGB printer, but, as I said, using the photo printer analogy, more pixel color options lead to more accurate color...

Assuming we're talking about a decent LCD with 8-bit or 10-bit color depth, the number of shades that can be generated by a single RGB triad already exceeds the number of shades your eyes can detect. Adding more shades by blending four RGB pixels doesn't magically increase the number of shades your eyes can see.

Well, most gaming monitors are 6-bit, the expensive TNs like the Asus Swift are 8-bit ... expensive IPS will get ya 10-bit. I will use what I am used to in AutoCAD as it's simpler to illustrate. There's the basic 256 colors and then the whole RGB thing ... so when detailing the subtle shades of tone across a person's face, for example, you can go from

245

to

243 244
245 246

It's not about whether you can produce a different shade that the eye can't distinguish ... it's about the fact that you can place 4 shades in a given space instead of 1 ... essentially "smoothing the jaggies".

I posted an image some time back where, if you looked at it at normal viewing distances, it was completely indistinguishable; at 32+" you could immediately make it out.

Walk into any high-end audio store and see what customers bring in to audition new purchases: either extremely high sampling rate digital media (Fiona Apple) or vinyl.

http://www.electronicproducts.com/Digital_ICs/Video_Graphics_Audio/Analog_Vs_Digital_Sound.aspx

[Image: 1.png - analog vs. digital waveform comparison from the article above]


Of course, when you get the sampling rate and the media density up and up and up ... you can get so close that it's extremely hard to tell the difference ... but like CGI in movies, oft it takes someone with a trained eye to point out anomalies that, once pointed out, are immediately obvious ... and sometimes, as we all know, it's rather obvious. How appropriate, it's Sharknado week!

Digital is catching up
The difference between analog and digital sound is no longer so cut and dried. Innovations in analog-to-digital conversion methods have improved the accuracy with which analog sound is replicated. The DVD audio format, as one example, allows greater data storage and thus a larger sample rate and bit size. Stop-motion photography can be made into film with a high enough frame rate, after all. This translates to the digital curve representing the digital sound wave smoothing out.
 

Sigh. You've concocted a contrary answer which is irrelevant to the original question. Yes, it is possible to come up with *a* scenario where, for a brief instant, the distance is identical. In the real world, though, the distance is (1) constantly changing, and (2) not going to be determined by the distance from your elbows to your hands.

Walk into any high-end audio store and see what customers bring in to audition new purchases: either extremely high sampling rate digital media (Fiona Apple) or vinyl.

http://www.electronicproducts.com/Digital_ICs/Video_Graphics_Audio/Analog_Vs_Digital_Sound.aspx

[Image: 1.png - analog vs. digital waveform comparison from the article above]
I'm done here. Clearly you haven't viewed the (very informative) video I posted which explains why that pic is wrong, and are arguing just for the sake of arguing.
 