How To Choose A Monitor

"The term is a literal one. It usually takes the form of a slider or series of presets that progressively reduce the brightness of blue in the image. You could produce the same effect by turning down the blue slider in a white balance adjustment. The result is a warmer picture and that can also reduce fatigue when you’re staring at black text on a white screen all day."

Yes and No.

Our eyes don't register blue light as well as the rest of the spectrum, so our pupils tend to open wider for a 100-lumen blue source than for a 100-lumen red source.

That said, the brain uses blue light as a cue for whether it is daytime, and it has been shown to be more active in the presence of blue light. This is part of why people have trouble falling asleep after staring at their tablets late at night.

The efforts go beyond just "turning down the blue slider"; it's more complex than that. Good low-blue-light monitors take the blue light and shift it down to a longer wavelength. A lot of blue light energy is wasted in the Stokes shift when creating the green and red set points. Another option is to move the primary emitter up to UV and perform the Stokes shift from there, which is even better, but that requires special UV filtering on the panel. The best low-blue-light monitors use separate R, G, and B LEDs tuned to the exact Adobe RGB and Rec. 2020 set-point wavelengths, but those get expensive and are reserved for very expensive VA professional-grade monitors.
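For what it's worth, the purely software-side approach the quoted passage describes really is just a blue-channel gain reduction. A minimal sketch of that idea in Python (the function name, strength value, and 8-bit RGB NumPy input are all illustrative assumptions, not anything from the article):

```python
import numpy as np

def reduce_blue(image: np.ndarray, strength: float = 0.3) -> np.ndarray:
    """Crude software-side 'low blue light' filter: scale the blue channel down.

    image    -- H x W x 3 uint8 array in RGB order (assumed input format)
    strength -- 0.0 leaves the image alone, 1.0 removes blue entirely
    """
    out = image.astype(np.float32)
    out[..., 2] *= (1.0 - strength)   # blue is channel index 2 in RGB order
    return np.clip(out, 0, 255).astype(np.uint8)
```

The warmer picture comes from exactly this kind of gain change; the panel-level techniques described above are what the software approach can't replicate.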
 


Here is my first-hand experience, from gaming on:
1) a GameCube on a 36-inch HD CRT at 720p, right in my face,
2) a 42-inch plasma in the living room, a few feet away from me,
3) a 60-inch LCD in the living room, a few feet from me,
4) a 24-inch 60 Hz TN at 1920x1200 in my bedroom,
5) my little brother's 24-inch 144 Hz IPS at 2560x1440 with 40-144 Hz FreeSync (the usable range is FAR larger than that, as we went down to 18 fps when the card failed to clock up for the game).

First and foremost, resolution on its own means very little unless you get something VERY big; what matters is pixel density. Look for roughly 100-120 PPI as the best range (a quick way to calculate PPI follows this list):
1080p - about 24 inches
1440p - about 28-32 inches
4k - about 40-48 inches
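PPI is just the pixel diagonal divided by the physical diagonal in inches. A minimal Python sketch; the sizes below are illustrative examples, not recommendations from the post:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from resolution and diagonal screen size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# A few example pairings (values rounded):
for w, h, d in [(1920, 1080, 24), (2560, 1440, 32), (3840, 2160, 43)]:
    print(f'{w}x{h} at {d} in: {ppi(w, h, d):.0f} PPI')
```

Those work out to roughly 92, 92, and 102 PPI respectively, which is why the size brackets above line up the way they do.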

Second, all monitors have some level of ghosting (except CRTs, I believe). Some can eliminate it or make it nearly unnoticeable by strobing the backlight; I've never seen it in person, but it's something to think about. Since every monitor has this issue to some degree, don't buy based on anti-ghosting features unless you can see the strobing first-hand.

The difference in doing ANYTHING at 120/144 Hz versus 60 Hz is immense; this on its own should be number two on your list of what to look for in a monitor.

How is the backlight, and does the display have bleed issues? There's no real way to know unless you can test it first-hand or know someone who will do it for you.

For gaming (this was on the 144 Hz panel with FreeSync): I can easily tell the difference between 30 and 60 fps, and just as easily between 60 and 120/144. But FreeSync... my god, does it make lower frame rates more tolerable. We were getting sub-20 fps in a game because the GPU didn't clock up (user error), and usually in that range I can instantly tell something is wrong; with FreeSync it was genuinely hard to notice. People telling you what a big difference this makes on its own are not kidding. Had I not known a graphics effect was supposed to be smoother than it appeared, I might not have noticed the frame rate had dropped below 30.
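The "range is FAR larger" observation is most likely low framerate compensation: when the frame rate drops below the panel's minimum refresh, the driver repeats each frame so the effective refresh lands back inside the supported window. A rough illustration of the principle only (this is not AMD's actual algorithm; the thresholds just match the 40-144 Hz range mentioned above):

```python
def effective_refresh(fps: float, vrr_min: float = 40.0, vrr_max: float = 144.0) -> float:
    """Repeat frames an integer number of times until the resulting refresh
    rate falls back inside the panel's variable-refresh window."""
    if fps >= vrr_min:
        return min(fps, vrr_max)
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier

print(effective_refresh(18))   # 18 fps -> panel refreshes at 54 Hz (each frame shown 3x)
print(effective_refresh(30))   # 30 fps -> panel refreshes at 60 Hz (each frame shown 2x)
```

That frame repetition is why an 18 fps dip can still look like smooth, tear-free (if slow) motion instead of the usual judder.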

Contrast-wise... this is the kicker: contrast ratio is king outside of professional graphics work. It's nice to have over 70% Adobe RGB coverage, but it means very little for general use or gaming. On that note, comparing the 144 Hz IPS and my six-year-old 60 Hz LCD, I don't know what to say; something about his display puts me off and I can't pin down what. The frame rate is much smoother and games feel better to play, but for general use something is off. That leads me to suspect his contrast ratio is worse than mine, making my 60 Hz panel look better. I'm not too sure, as he won't let me test it with anything whose colors and brightness I know well.

1) Contrast: 1000:1 minimum; more is better.
2) 120/144 Hz refresh: a firm second on the list.
3) PPI in the 90-120 range.

Everything else comes after those.

My experience with his IPS makes me question whether IPS is worth getting. He has an XF270HU, I believe, and I just can't justify the $350 price premium over a 144 Hz TN with FreeSync.
 
I think you should have at least touched on some of the (possibly proprietary) variations of LCD technology and put them in perspective, performance-wise, against the basic technologies. Since I've been looking for a new monitor, I keep seeing MVA, PVA and others(?). In a few lines you could have given the basic pros and cons of such variations, especially since they seem to be VA variants, which you appear to hold in high regard. Otherwise, OLED seems to be the killer tech. Who needs 1000 nits in your face anyway?
 
I am having trouble with the concept of a pre-calibrated monitor. While the accuracy of such a monitor might be greater, out of the box, than a monitor that is not pre-calibrated, the only way to ensure that the accuracy of the entire system is as high as possible is to calibrate the monitor after it is attached to your computer. Do pre-calibrated monitors come with a file that can be loaded on your computer so that you can set your computer to the same levels at which the monitor was driven when it was calibrated?

How do we know our computers are actually outputting an accurate signal in the first place? If the computer is not outputting an accurate color signal, then, to me at least, pre-calibration of a monitor will not necessarily increase the accuracy of the overall system.

Someone doing photography work, for instance, should, IMO, photograph a test pattern with each of their cameras, then develop a profile for the monitor based on the photograph of the test pattern for each camera - preferably switching profiles when working with the different cameras.

As I see it, accuracy has been, and likely still is, a fuzzy subject. It depends on an end-to-end calibration, and having one end (such as the monitor) calibrated to some standard is not necessarily going to give overall accuracy to the system.
 
The misunderstanding here is thinking the computer needs to be calibrated; it doesn't. It outputs a digital signal. The same program, with the same OS and software settings, will output the same bits for a given image. Always.

Factory-calibrated monitors adjust the incoming signal with their internal electronics. That said, color profiles are often available for monitors, so that your software knows the monitor's capabilities (e.g., its color gamut).

Now, you're correct that monitors do drift out of calibration over time. A professional should calibrate their monitor(s) regularly. Some high-end monitors include self-calibration, where they display an internally generated pattern and use either an internal or external sensor to measure the light output.

You wouldn't calibrate your monitor to a single camera. You'd create a profile for each camera. Then, your photo software could use the profile to compensate for the camera's inaccuracies. That way, you can simultaneously edit & combine photos taken with different cameras. More importantly, when you want to send the photos for print, they'd be calibrated to a known standard, allowing predictable output, even if you don't know what kind of printer will be used and the print shop doesn't have your cameras' calibration data.
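To make the per-camera profile idea concrete: a real camera profile (ICC or DCP) is far richer than this, but the core operation is mapping camera RGB into a standard working space, which a simple matrix can illustrate. A minimal sketch with made-up matrices, purely for illustration:

```python
import numpy as np

# Hypothetical per-camera correction matrices. A real profile also handles
# tone curves and hue twists; a 3x3 matrix only captures the basic idea of
# mapping camera RGB toward a standard working space such as sRGB.
CAMERA_PROFILES = {
    "camera_a": np.array([[ 1.05, -0.03, -0.02],
                          [-0.02,  1.01,  0.01],
                          [ 0.00, -0.04,  1.04]]),
    "camera_b": np.eye(3),   # this camera is already close to the standard
}

def to_working_space(rgb_linear: np.ndarray, camera: str) -> np.ndarray:
    """Map linear camera RGB (H x W x 3, floats 0..1) into the working space."""
    m = CAMERA_PROFILES[camera]
    return np.clip(rgb_linear @ m.T, 0.0, 1.0)
```

Because every camera is pulled toward the same working space, photos from different bodies can be edited and combined together, and the print shop only needs the standard, not your cameras' quirks.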
 
wow, all of that.

My last one, I just went into the store and picked the size I wanted at a good price. Yay, it works!

Well, I checked the resolution and ports too, and made sure it did not say Samsung. I hate Samsung.
 


Actually... NEVER!

Unless your monitor has a programmable LUT, ALL the computation for real calibration (not just white point and gamma) must be done in the computer, and you DO need a color management file that converts standard colors into a mapped version producing the closest possible colors on the monitor itself. Throw in environmental calibration (if you have strong, off-color lighting or need perfect colors) and even that factory calibration file won't be good enough!
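As a concrete example of the computer-side piece: the video card's gamma ramp (the vcgt portion of an ICC profile) is a per-channel 1D LUT applied before the signal leaves the GPU. This covers only white point and gamma, not the full 3D color mapping referred to above, but it shows where that computation lives. A minimal sketch, with assumed function names and an assumed 2.2 target:

```python
import numpy as np

def build_gamma_lut(measured_gamma: float, target_gamma: float = 2.2,
                    size: int = 256) -> np.ndarray:
    """Per-channel 1D LUT that pre-distorts the signal so a display measured
    at `measured_gamma` behaves as if it had `target_gamma`."""
    x = np.linspace(0.0, 1.0, size)
    corrected = x ** (target_gamma / measured_gamma)
    return np.round(corrected * (size - 1)).astype(np.uint16)

def apply_lut(channel: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Remap an 8-bit channel (values 0-255) through the LUT."""
    return lut[channel]

# Example: a panel that measures at gamma 2.4 gets pulled back toward 2.2.
lut = build_gamma_lut(measured_gamma=2.4)
```

Anything beyond this (gamut remapping, ambient compensation) needs either a monitor-side 3D LUT or color-managed software, which is the point being made.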
 
Thanks for muddying the water. Go back and read what wiyosaya was asking about: factory-calibrated monitors. Factory calibration implies that the monitor has a LUT or similar.

I stand by my point - the computer always outputs exactly what it's told, and therefore doesn't need to be calibrated. It can certainly compensate for a dumb monitor, room lighting, etc. Furthermore, you wouldn't perform an end-to-end calibration to adjust your monitor for a single photographic camera, as wiyosaya was suggesting.

I wasn't trying to write a treatise on calibration - just touch on some basic ideas. Feel free to have a go, if you'd like.
 
BenQ makes some very cheap $200 27-inch 1920x1080 60 Hz VA monitors. I have no ghosting problems gaming on mine, and the contrast ratio really is a big step up from IPS. Viewing angles are damn near identical to IPS too.

The 3000:1 contrast ratio versus 1000:1 makes movie watching and gaming in dark environments so much more enjoyable. The picture looks as if a veil has been lifted and everything is clearer. IMO this was the best interim choice while I wait for OLED, since the investment wasn't much.
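The practical difference is easy to quantify: black level is white luminance divided by static contrast ratio, so at the same brightness setting a 3000:1 panel sits at roughly a third of the black level of a 1000:1 panel. A quick sketch; the 200-nit white point is just an illustrative setting:

```python
def black_level(white_nits: float, contrast_ratio: float) -> float:
    """Native black level implied by a panel's static contrast ratio."""
    return white_nits / contrast_ratio

# At the same 200-nit white point:
print(black_level(200, 1000))   # ~0.20 nits on a typical 1000:1 IPS
print(black_level(200, 3000))   # ~0.067 nits on a 3000:1 VA
```

In a dark room that difference in black floor is exactly what reads as the "veil being lifted".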
 