Display Calibration 101: Step-By-Step With Datacolor's Spyder4Elite

In future monitor reviews, can you put up comparisons between profiles (when available)? My Dell U3011 came with sRGB and Adobe RGB factory profiles, and it would be nice to know how close those profiles come to manual calibration results.

Cheaper monitors come with game, movie, and similar presets, and comparing those would be great too.
 
Note 1: I'm no expert, but I've read about and dabbled in these products for about a year now, so take my advice as you see fit.
Note 2: I own or have used the Spyder HDTV 3 & 4, X-Rite i1Publish with i1Pro 2 meter, ColorMunki Photo, and SpectraCal CalMAN with C6 meter.


You mentioned warming up your monitor, which is good advice. But don't forget to warm up your room lighting as well (for the same reason) if you plan on using the ambient light sensor reading or making any manual adjustments at the end of the calibration process!



I agree. But higher-end monitors come precalibrated, meaning any source which doesn't alter the image/color data will be reproduced properly on one of these displays (or at least as well as the factory calibration provides).



Looking at before-and-after pictures on your own monitor is a pointless exercise, for two reasons: 1) your monitor would have to be calibrated in the first place (in which case you don't need this product), and 2) even if your monitor is calibrated, the change you see in the review's images is specific to the reviewer's monitor. His profile sends a different color to his monitor than what the image data says, so that when his monitor reproduces the (altered) color, it appears as the intended (original) color found in the image data. (This argument actually holds even if your monitor isn't calibrated.)

For example, if the color in the image is gray (R128, G128, B128) and the Spyder meter reads a green-ish color (R110, G128, B110), the color profile will alter the original gray (to perhaps R140, G128, B140) as it leaves the video card. The monitor then receives a signal with more red and blue to compensate for the deficiency in the sensor reading, which makes the color displayed to the user closer to the intended gray (R128, G128, B128).
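
(A toy Python sketch of that idea, using the numbers from the example above; a real ICC profile uses per-channel curves and look-up tables rather than this one-line arithmetic, so treat it as illustration only.)

```python
# Hypothetical numbers from the gray-patch example above.
intended = (128, 128, 128)   # what the image data says
measured = (110, 128, 110)   # what the meter saw on screen (green-ish)

# Push each channel in the opposite direction of its measured error,
# clamping to the valid 0-255 range.
corrected = tuple(
    max(0, min(255, want + (want - got)))
    for want, got in zip(intended, measured)
)
print(corrected)  # (146, 128, 146) - extra red and blue, same idea as the text's ~(140, 128, 140)
```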



Following my answer to the quote directly above: perhaps your monitor was correct at first, and the changes from the review's profile made it worse for you (i.e., made the images duller and too red) but actually made it better for him (i.e., fixed his gamma and added the red his monitor was lacking). If you're correctly calibrated and an unaltered image looks worse, then the quoted comment above is valid. But perhaps it wasn't bad taste; maybe it was the artist's intent. An artist's "taste" can play a big role in what looks good, too.

Lastly, many people aren't used to viewing content (e.g., movies, images, your desktop background, etc.) on a properly calibrated display, so things will look different and at times even "wrong" or "off." For example, images and movies might appear too "warm" (i.e., too red) because the user is used to viewing content too "cool" (i.e., too blue) without knowing it. Even the basic UI elements in Windows and Mac OS appear quite different on a properly calibrated display, and that's simply because you're finally viewing them correctly!



See my comments above.



TV calibration works differently, at least with Spyder's HDTV calibration (as far as I'm aware, having used Spyder's products in the past). With those products, you are actually drilling into the menus on the TV (e.g., hue, tint, brightness, contrast, temperature, etc.) and making the changes on the TV as you take readings. The changes are made and saved on the display device (i.e., the TV itself) and not on the connected computer; this is what the first quote above is referring to. Since the settings are saved on the TV (i.e., if you set your TV to "movie mode," turn it off, and turn it back on, it'll still be in "movie mode"), you can calibrate many TVs using the same software. Be aware that many TVs have separate settings for each input, so you might have to write down the final settings after calibrating the input you used and duplicate them on the other inputs (if that's your intent).

Note, however, that if you're using a PC to generate the test images/colors for the calibration process, instead of a downloaded or bundled DVD or Blu-ray disc with the color patterns on it, make sure the first thing you do is disable any color profiles on your computer! If not, the colors on the TV will only be correct when using that PC, and wrong when the Blu-ray player, for example, is used on that input.

If you have the same PC connected to a monitor as well as your TV, then you run into a bit of a problem. If you're simply using the PC to calibrate the TV, then just heed the warning in the paragraph above. But if you play your content from the PC and watch it on the TV, AND you have a monitor connected to the PC which you also use, then there's a possible issue. I'm not 100% sure how Win8 handles multiple monitors and color profiles, but I believe it's the same as Win7: Windows has a limitation that only one color profile can be loaded at a time, and it is applied to all monitors (and TVs, since you're using the TV as a monitor) connected to the same computer. Perhaps it only applies to all monitors connected to a single video card (i.e., if you have two video cards, each with a monitor attached, then two color profiles can be used), but I'm not sure. Mac OS X, on the other hand, handles multiple displays much better: you can specify a different color profile (or the same one, if desired) for each display independent of all the others.

What does this mean? Basically, you'll need two color profiles (one for the PC monitor and one for the TV), and only one display will show colors correctly, depending on which profile is loaded. So you'll have to switch between the profiles depending on which display you want to be correct at that moment, or have one wrong all the time (i.e., the monitor will always be wrong so your TV can always be correct).

If you don't have the HDTV upgrade, then only the computer connected to the TV will be calibrated properly, while the input used for your Blu-ray player will be wrong. With the HDTV upgrade, you are basically calibrating the TV for each input, regardless of what's feeding it (e.g., Blu-ray/DVD player, cable/satellite set-top box, Xbox/PlayStation, etc.). So for best results with a PC connected to both a PC monitor and a TV (with the TV's accuracy having higher priority): use the HDTV calibration process to get the TV to the best possible calibration, then calibrate the PC monitor using only its manual controls (hopefully your monitor has RGB controls); if the monitor calibration process creates a profile, don't use it. At this point the monitor and the TV will be as close to each other as possible while using only the controls built into the displays themselves. Then go back to the TV, run the monitor calibration process on it, and this time use the profile it creates. This way your TV will be accurate while your PC monitor will be close to it. Perhaps I'm way off in the deep end, as I'm no expert, but that's how I think you could do it.



See my multi-monitor limitation comment above; just search for "handles multiple monitors" in my answer immediately above.



Your display should have come with a personalized (i.e., specific to that monitor only) printout from the calibration process at the factory, as my Dell monitor did. To achieve the best results with one of these displays without owning a calibration device, make sure there is no color profile loaded in your OS settings (Mac defaults to "monitor" or something generic like that), then set the "Preset Mode" of your monitor to sRGB. You're all set/calibrated. Your average DeltaE 2000 error should be <= 3, which is great (meaning any color reproduction errors should not be noticeable to a normal person). From the factory, my DeltaE for the same monitor was about 2 (according to the printout), but after calibrating I got it down to a 1.18 average. Changing to any of the other modes will only make it less accurate.
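
(For the curious, here's a minimal Python sketch of a color-difference calculation, using the simpler 1976 DeltaE formula and made-up Lab values; the DeltaE 2000 figure on the factory printout adds lightness/chroma/hue weightings, but the idea — a distance between intended and measured color — is the same.)

```python
import math

def delta_e_76(lab1, lab2):
    """Euclidean distance in CIELAB space (the 1976 DeltaE formula)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

intended = (53.0, 0.0, 0.0)    # hypothetical Lab value for a mid-gray patch
measured = (54.0, -1.5, 1.0)   # hypothetical reading from the screen

print(round(delta_e_76(intended, measured), 2))  # 2.06 - under the <= 3 "not noticeable" bar
```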


I hope this helps everyone out.
-GWolfman
 


Windows 7 can load a separate profile for each monitor.
 
I agree. But higher-end monitors come precalibrated, meaning any source which doesn't alter the image/color data will be reproduced properly on one of these displays (or at least as well as the factory calibration provides).
But the color of your monitor changes over time as the backlights and panels age.

TV calibration works differently... With those products, you are actually drilling into the menus on the TV and making the changes on the TV as you take readings. The changes are made and saved on the display device (i.e., the TV itself) and not on the connected computer.
Spyder2Pro lets you do that. You just have to tick the box that says your monitor has colour sliders.

And yes, Windows 7 (probably 8 too, and I imagine Vista/XP) can run one colour profile per display. I'm doing it now. You have to trick the Express versions of Spyder, though, by renaming the actual profiles so they don't get overwritten every time you calibrate another display.
 
How deeply is the generated LUT embedded into the operating system / graphics hardware? That is, once installed, are you guaranteed that all activities on your computer will make use of it, or is it more application-dependent, where for example Photoshop will respect the LUT while games and video players might bypass it because they're "writing straight to the hardware"? I also saw a post somewhere else claiming that while Firefox is color-calibration aware, IE and Chrome are not. Can anyone shed light on this?

I've been interested in this overall topic for a while, but in my quick scan of various forum threads I feel like I've seen too many complaints of the "didn't look right because app <x> isn't color aware" or "app <x> uses a different color space from your monitor <y> so you're screwed" variety, and have been scared off.

I also agree with the poster above: as a casual home user, if I'm going to spend money on a solution, I'd really prefer that it covered not only my computer but also all my displays, which in my case includes both plasma and LCD TVs (which use sources that are not computers and so can't use a LUT). It seems this package is not the right one for multi-purpose home use (no plasma option is offered, and it calibrates the computer rather than the display). Will any of the future articles cover an approach that works for all of a home's displays?
 
Several of you have touched on the subject of TV calibration with the Spyder and other tools. The most important thing to get right is pattern generation. You should only use a computer as the pattern source when it will also be the video source (i.e., an HTPC or movie server). Otherwise, a Blu-ray player is what you want. As others have pointed out, the issue of look-up tables can put your display in a worse state than before calibration unless you're certain the PC is a match for your other sources. There are calibration discs out there that can give you the necessary patterns for calibrating with the Spyder or any other low-priced meter. If you get the patterns right, the results can be very good. And don't overlook free DVD solutions like GetGray. Resolution doesn't matter when measuring color; a DVD will work fine for calibrating a TV or projector.

-Christian
 
Here's a long-running thread on an Nvidia forum seeming to confirm that the majority of games running fullscreen will ignore any color profiles. It suggests there is a reliable workaround for Radeon users, and a bunch of might-or-might-not-work approaches for Nvidia users:

https://forums.geforce.com/default/topic/501853/geforce-drivers/nvidia-forever-ignoring-custom-color-profile-support-in-full-screen-games-collaboration-thread-on-t/13/

I've also found semi-recent threads indicating that MPC-HC is the only video player that will support color profiles in hardware-accelerated playback, and that Windows Media Player specifically will not. The problem is they're all user-to-user comments, so I don't know if they're authoritative or if later updates have fixed it.

Bottom line: I continue to worry that while a color professional will doubtless be using a workflow chain that has been entirely vetted to make consistent use of the profiles, the casual home user will be frequently flipping between use cases that are and are not profile-aware, possibly leading to more harm done than good.

Maybe this is something Tom's could investigate and include in a subsequent report? Maybe it's also worth including as part of the standard checks done while reviewing video cards, since at least in that one thread I linked, it appears to be an area where AMD provides important functionality that Nvidia does not (something I did not know until now, which is a bummer given I just bought an expensive Nvidia card).
 
Why does a consumer have to do this in the first place? Calibrating the darn screen to show an expected picture should be the manufacturer's job...

Reviewers have been WAY too accepting of this crap for way too long. It's time to start slamming companies that don't calibrate screens out of the box.
 


This is a question that has been asked many times. On the surface, you're right; displays should come calibrated and users shouldn't have to do anything. In practice, this is not possible for a few reasons.

First, calibration is a time-consuming process. Even adding 15 minutes to the manufacturing time of a panel would add to the price. Why do you think factory-calibrated monitors cost more?

Second, every sample is a little different. We publish the settings we use in every monitor review, but they won't work perfectly on every example of a particular monitor. They'll be close; still, if I calibrated 10 examples of the AOC I2757FH, I would get 10 different sets of settings.

I have two comparisons I like to make when answering why calibration is necessary. Does your car come perfectly aligned from the factory? Take it from the showroom to a shop and you'll find out that it doesn't, and it will need periodic readjustment. The second is a piano. Do pianos come tuned from the factory? Sure, they're tuned at the factory. Then they're subjected to movement, temperature and humidity changes, and God knows what else before they get to your music studio. What's the first thing a pianist does with a new instrument? They have it tuned.

Manufacturers have improved their out-of-box performance enormously in recent years. We do compare the stock performance of all our review displays, and we give kudos to the monitors that measure well before calibration. I'd say that means we don't accept a poorly performing display; we always point out a monitor that doesn't measure well in its stock configuration. Why do you think displays have improved so much?

-Christian
 
I've wanted accurate color on my monitor for a long time. But after reading forums and spec sheets, I found that most if not all of the calibration tools just create a color profile for Windows to use. Which is fine if your software will use it. Like most of us here, I game a lot and watch movies; only rarely will I open a photo program to do editing -- which is really the only kind of program that will use this color profile. Crysis won't use it, nor will World of Tanks or Battlefield, and movie software is doubtful. So this $250 device is all but useless for 90% of what we do.

I think Win7 (and maybe Vista) will use color profiles for the desktop, which may help for windowed games, but that still means fullscreen games will ignore the profile, again making this thing useless.
So I'm failing to see how monitor calibration will make non-photo programs look better.
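
(To illustrate the point: on Windows, LUT-based calibration ultimately lives in the video card's gamma ramp, set through GDI's SetDeviceGammaRamp. Below is a minimal, Windows-only ctypes sketch — not any vendor's loader code — showing that mechanism; a fullscreen game can call the very same API and silently replace whatever the profile loader set.)

```python
import ctypes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32
user32.GetDC.restype = ctypes.c_void_p                          # HDC is a pointer-sized handle
user32.ReleaseDC.argtypes = [ctypes.c_void_p, ctypes.c_void_p]
gdi32.SetDeviceGammaRamp.argtypes = [ctypes.c_void_p, ctypes.c_void_p]

# The video card's gamma "LUT" is a 3 x 256 table of 16-bit values (R, G, B).
GammaRamp = (ctypes.c_ushort * 256) * 3

def identity_ramp():
    ramp = GammaRamp()
    for i in range(256):
        for channel in range(3):
            ramp[channel][i] = i * 257   # map 0..255 onto 0..65535, no correction
    return ramp

hdc = user32.GetDC(None)                 # device context for the whole screen
# Loading this identity ramp wipes any calibration - the same call a game
# makes when it applies its own in-game brightness/gamma setting.
gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(identity_ramp()))
user32.ReleaseDC(None, hdc)
```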


What's needed here is something that will write calibration data directly to the monitor. So I went searching for something like that. It turns out NEC makes monitors that do just this. The problem is they are EXPENSIVE. So I saved for a while and got an NEC PA271W and a calibration tool ($1300 birthday present).
After calibration, the image quality was amazing! And since the calibration data lives in the monitor, even games looked noticeably better.

The price was well worth it for me, but not everyone can afford $1300 for a monitor.
 


Something we try to highlight in our monitor reviews is which screens offer the most and best controls for calibration. You don't need to write data to the monitor if it has properly engineered adjustments for white point, gamma, and color. We always calibrate our review displays by adjusting the controls only. If a monitor doesn't offer a complete set of adjustments, we point that out.

-Christian

 
Thanks kittle, I was going a little crazy wondering why no one else seemed to care that a tutorial claiming to teach you how to calibrate a display was advocating an approach that would not work for important enthusiast use cases. If this were Tom's Photographer's Guide rather than Tom's Hardware Guide, it would of course be a different story; most of the audience probably would not care. But given the content of most of Tom's other articles (think how much of the typical system review is devoted to page after page of gaming metrics), it's very strange for that use case to be ignored here.

The good news, I guess, is that there are hints in the article, and more details in the comments, that you can use your monitor's controls instead of creating a system LUT; and there are now heads-ups in the comments on why that's the way to go, for the readers who see them. For the reader who doesn't, though, and merely follows the tutorial, I feel they are being led down the wrong path.

Hopefully future articles in this series will be primarily focused on adjusting the monitor (with calibration equipment and software reviewed for its workflow in that specific case), or at the very least include all the necessary information on how to get a LUT to work with Nvidia vs. Radeon for fullscreen games and videos. Or simply include a disclaimer that the article is focused on people doing content creation rather than content consumption.
 


On my monitor (properly calibrated from the factory), the before picture looks washed out and grey. The after picture looks a lot better, with more accurate color reproduction.

In other words, a properly calibrated monitor will show the true look of the before/after pictures.
 


This article was meant to focus on the Spyder4Elite solution which, as you say, is primarily about creating a look-up table for the video card. You can calibrate using the monitor's controls with the Spyder, but it's really not practical.

Future articles will focus on the use of the monitor's controls. We'll use CalMAN along with the adjustments found on the majority of monitors to demonstrate that calibration method, which is what we do at Tom's for every review. Our monitor reviews now include a thorough rundown of the OSD and how you can use it to calibrate. And of course, the settings for every review have been included since the beginning of the year.

-Christian
 
This article covers the display calibration process well. I own the Spyder4Elite and can attest that it is very effective at display calibration/optimization. I actually bought it because I could not stand the difference in color between the screens of my dual-monitor extended desktop. Although the displays are not exactly matched now (probably not possible, due to one being LCD and the other being LED-backlit, 3D-capable, etc.), they are close enough that it no longer bothers me.

So, the Spyder4Elite is effective at PC display calibration and multiple-monitor matching. However, that is the only thing it is good at.

DO NOT buy the Spyder4Elite if you plan to use it in conjunction with the TV HD upgrade. The Spyder4 sensor simply does not calibrate HDTVs correctly. For one thing, the sensor does not communicate with the TV, so all calibration settings are entered manually. This is time-consuming, which would be fine if the end result were a correctly calibrated display; yet this is not the case. For some reason, the Spyder4 sensor and TV HD software fail to give accurate recommended settings for brightness, and the colors end up over-saturated with red. Skin tone is way off, and no calibration at all is done on a number of important settings (e.g., sharpness). Having run the software on multiple TVs (and having badly wanted it to succeed, given my $100 expenditure on the product), I can assure you that the best option is to look up optimal settings for your TV online.

BTW: I know that TVs in retail stores are set up to "pop" by oversaturating with blue. I also know that many TVs have the so-called "red push," and that our eyes may not be used to the additional redness of a properly calibrated TV. That is not what I'm talking about. The display ends up with color tones that are too dark, too red, and without question not true to life.

Finally, the Spyder4Elite doesn't calibrate iPhones and iPads correctly at all, so don't even bother with that. I don't know if the fault is with the software, the sensor, or the app, but you'll end up with some messed-up colors on your mobile device.

In summation:

Buy the Spyder4Elite if you want a simple and effective way of calibrating your computer display.

DO NOT waste your money or time on the HDTV upgrade or the SpyderGallery mobile-device calibration software.
 


I was simply stating that, with regard to an HDTV, sharpness is not even touched upon in the included software. The colorimeter does more than color, BTW; it also determines optimal settings such as contrast. So I ask you: how do you separate contrast and sharpness? Are the two not intertwined?

The point is moot. I do not claim to be an expert at display calibration, and I am far from a professional in this regard. However, the fact remains:

DO NOT buy the Spyder4 if you primarily plan to use it on your TV, as it is ineffectual at doing so. Trust me when I say this was not an easy conclusion to come to. I spent an extra $100 on the HDTV upgrade, and there is always a part of me that wants to give it another whirl. But the software, hardware... whatever... simply fails at accurately calibrating these display types. If you do not believe me, then by all means purchase it and attempt to calibrate an HDTV with it. When you fail to create a true-to-life picture, you and I can share buyer's remorse... OR you can trust me and the plethora of other dissatisfied buyers (look around for reviews on this issue) and stay away from it.

I do not mean to be strident in my response, but the user who submitted the post to which I am responding gives me the impression that he/she just looked for something to nitpick in my comment as opposed to considering it in its entirety. My review was not entirely negative... I think the Spyder4Elite software for calibrating computer monitors is excellent. But I wanted to warn those thinking of buying the HDTV upgrade (and will now reiterate) that it is a waste of money and, perhaps more frustrating, time, as it takes about 20 minutes to complete the calibration.

So, potential Spyder4 HDTV software buyers beware: THIS PRODUCT IS NOT EFFECTIVE AT CALIBRATING HDTVs. I do not know exactly why it fails, but it does. If you are dubious, I suggest you read more reviews, or else contact me so I can say I told you so.
 
Um, no, they aren't related. A reading from the center of a sufficiently large block of one colour will show no difference between sharpness = 0 and sharpness = 100, because sharpness is all about the colour around edges.
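
(A quick NumPy sketch of why that's true, using a generic sharpening kernel rather than any TV's actual processing: the flat interior of a patch comes through unchanged; only the samples next to the edge move.)

```python
import numpy as np

# One "scanline": a flat light block next to a flat dark block (one edge).
scan = np.array([128] * 8 + [64] * 8, dtype=float)

# Generic sharpening kernel: weights sum to 1, so flat regions are unchanged.
kernel = np.array([-0.5, 2.0, -0.5])
sharpened = np.convolve(scan, kernel, mode="same")

print(scan[1:-1].astype(int))        # [128 ... 128  64 ...  64]
print(sharpened[1:-1].astype(int))   # [128 ... 160  32 ...  64]
# (The two border samples are dropped because np.convolve zero-pads the ends.)
# Only the two samples straddling the edge change (128 -> 160, 64 -> 32);
# the centers of both blocks are untouched, so a meter reading the middle
# of a large solid patch cannot see the sharpness setting at all.
```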

I do, however, agree that the HDTV package is probably useless. I used Spyder2Express with a LUT on my TV, but that's because the video always comes from an HTPC, making it easy to calibrate. Spyder2Pro's ability to walk you through moving the colour sliders would probably work fine on a TV, though as you said it takes a while to calibrate. 20 minutes isn't much, though; I waste more time than that on random Wikipedia browsing.
 

Thanks Christian, that sounds great. Looking forward to it!
 