How to Calibrate Your Monitor
This is the second installment of a 2-part guest post by Jim Perkins, a professor at the Rochester Institute of Technology's medical illustration program. His first post detailed why it's a good idea to calibrate your computer monitor regularly. This next post walks us through the process and explains the mysterious settings known as gamma and white point.
Guest post by Jim Perkins
In my previous guest post, I encouraged all digital artists to invest in a monitor calibration system. Proper calibration guarantees that the image shown on screen matches the numerical color data saved in the digital file. Assuming your client uses calibrated printing equipment, there should be a nearly perfect match between the image you see on screen and the final printed piece.
If you have never calibrated your monitor, it's almost certainly out of whack. Maybe a lot. Maybe a little. There's really no way to know unless you generate an expensive prepress proof (e.g., a Kodak Approval, Fuji FinalProof, or Creo Veris) and compare it to the on-screen image. Even a high-quality monitor may not display colors accurately, especially as it ages. All monitors change over time, so calibration must be done on a regular basis. Most experts recommend doing it every few weeks to every few months.
The basics of monitor calibration are pretty simple. You hang a measuring device (a colorimeter) in front of your monitor. The calibration software then displays a series of color swatches on screen. The colorimeter measures these swatches to check whether the colors displayed on screen match the values they are supposed to represent. If there are discrepancies, the software can adjust the monitor to improve color accuracy.
In practice, however, calibration is a little trickier. First of all, you need to control some aspects of the monitor's environment to ensure proper calibration. Second, you must make some critical decisions about how you want the monitor to display color. As I'll discuss below, these decisions depend on whether you are creating art primarily for print, on-screen display (web, gaming), or broadcast (TV/film).
Calibration Conditions
Calibration should be done under the same conditions that you normally use the monitor. You don't want to calibrate under one set of conditions and use the monitor under different conditions; it won't look the same. For example, a monitor's display can change as it warms up, so be sure to turn the monitor on at least 30 minutes before calibrating to let it reach normal operating temperature. This was more of a concern with old CRT monitors, but it applies to flat-panel LCDs as well.
Next, make sure you are using your monitor under moderate ambient lighting conditions. It's not necessary to work in the dark, but the monitor should be the strongest light source in your work area. Don't have strong lights shining directly on the screen, as this will affect the apparent brightness of the display and can introduce a color cast. Some calibration systems have ambient light sensors to compensate for this, but they're not perfect.
Some photo studios and prepress services go so far as to paint their walls and furniture a neutral 50% gray and use only daylight-balanced D50 fluorescent lights. The International Organization for Standardization (ISO – www.iso.org) publishes a set of guidelines called "Graphic Technology and Photography – Viewing Conditions" (ISO 3664:2009) for photographers, artists, and web developers; and a stricter set of guidelines for photo imaging labs and prepress service bureaus called "Graphic Technology – Displays for Colour Proofing – Characteristics and Viewing Conditions" (ISO 12646:2008). This is probably overkill for most artists.
Choosing Your System's Gamma
When you connect the colorimeter and run the calibration software, it will ask you to select some important settings. The two most important settings are gamma and color temperature, both of which are fairly difficult concepts to understand.
Gamma is the relationship between the numerical value of a pixel in an image file and the brightness of that pixel when viewed on screen. The computer translates the numerical values in the image file into voltages that are sent to the monitor. This relationship is non-linear: doubling the voltage does not double the brightness. For almost all TVs and computer monitors, the brightness is proportional to the voltage raised to the 2.5 power. The gamma of these devices, therefore, is said to be 2.5.
Gamma correction is a way of compensating for this non-linear relationship between voltage and brightness. A combination of hardware and/or software can reduce the gamma to something closer to 1.0, i.e., a perfectly linear relationship. This helps ensure that a change in pixel value in the digital file translates into a proportional change in brightness on screen.
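To make this concrete, here's a minimal sketch in Python (my own illustration, not code from any calibration package) showing how a display's native gamma darkens a midtone and how gamma correction compensates. The 2.5 exponent is the approximate native response described above:

```python
DISPLAY_GAMMA = 2.5  # approximate native response of a CRT-style display

def display_brightness(pixel_value, gamma=DISPLAY_GAMMA):
    """Relative brightness (0.0-1.0) the screen produces for an 8-bit pixel."""
    return (pixel_value / 255.0) ** gamma

def gamma_correct(pixel_value, target_gamma=1.0, gamma=DISPLAY_GAMMA):
    """Pre-adjust a pixel so the displayed result follows target_gamma."""
    corrected = (pixel_value / 255.0) ** (target_gamma / gamma)
    return round(corrected * 255)

# A 50% gray (pixel value 128) on an uncorrected display:
print(display_brightness(128))                 # ~0.18, much darker than 50%
# The same gray after correcting toward a linear (1.0) response:
print(display_brightness(gamma_correct(128)))  # ~0.50
```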
Prior to calibrating a monitor, it is critical to tell the calibration software which gamma setting you wish to use. Historically, there has been a big difference in hardware gamma correction between Macs and PCs. For many years, this dictated the choice of gamma on these two platforms. However, as we'll see below, the choice now depends more on the type of work you do and not on the operating system.
From its introduction in 1984, the Macintosh had built-in correction that brought the gamma of the system down to 1.8. Therefore, we say that the "system gamma" of Macs is 1.8. Apple chose this number for a very good reason. It turns out that printing devices have a kind of gamma as well. A 10% gray area of pixels in a digital file is printed as a series of tiny dots that cover 10% of the surface of the paper. In theory, this should produce the appearance of a 10% gray on paper, matching the value in the digital file. In practice, however, the ink or toner bleeds into the paper and spreads (a phenomenon called "dot gain"), creating a pattern of dots that covers more than 10% of the paper. This makes the printed image appear darker than it should, especially in the midtones. The Mac system gamma of 1.8 compensates for this phenomenon, making the image slightly lighter so it matches the digital file.
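If the numbers help, here's a toy model of dot gain (my own simplification; real presses are characterized by measured curves, not a formula). By convention, a press with "20% dot gain" prints a 50% tint as roughly 70% coverage, with the gain tapering off toward the highlights and shadows:

```python
def printed_coverage(tint, midtone_gain=0.20):
    """Toy dot-gain model: extra ink coverage that peaks in the midtones."""
    # Parabolic gain curve: zero at 0% and 100% tints, maximal at 50%.
    gain = midtone_gain * 4 * tint * (1 - tint)
    return tint + gain

print(printed_coverage(0.10))  # ~0.17 -- a 10% gray prints noticeably darker
print(printed_coverage(0.50))  # 0.70  -- the conventional midtone figure
```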
The original Mac was designed from the outset to be a graphic arts system. Its release coincided with the introduction of the Apple LaserWriter, the Linotype Linotronic imagesetter, and Aldus PageMaker, the first page layout program. All of these components were tied together by the PostScript page description language, also released in 1984 by a fledgling company called Adobe. This launched the desktop publishing revolution of the mid-1980s and beyond. It was no coincidence that Apple chose a system gamma geared towards print output.
Windows PCs, on the other hand, have never had built-in gamma correction, although this is an option on some graphics cards. This reflects the fact that PCs were always targeted towards business and the mass consumer market rather than to graphics professionals. With no hardware correction, the Windows system gamma is about 2.2.
With the release of Mac OS X 10.6 (Snow Leopard) in 2009, Apple finally changed their default system gamma from 1.8 to 2.2. They did this, of course, to ensure that video games and web images looked the same on Mac and PC systems. In doing so, however, they abandoned their traditional base of support among graphics professionals.
The choice of gamma settings, therefore, is no longer dictated by the computer platform or operating system. Instead, when calibrating your monitor, you can choose a gamma setting that is best suited to the type of work you normally do. This will override the built-in settings of the system.
If you create mostly images that will be viewed on screen – for the web, PowerPoint, video games, etc. – set your gamma to 2.2. This will help ensure that your images look consistent across the widest range of computers used in business and the mass consumer market.
On the other hand, if you still create most of your work for print (as I do), stick with 1.8. Not only is this setting more compatible with high-end printing systems, it also produces noticeably lighter images on screen. This helps you see detail in shadows, something that is critical when creating and editing digital images.
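You can see the difference with a quick calculation (assuming a simple power-law response, which is an idealization): the same 50% gray in a file is displayed noticeably brighter under a 1.8 system gamma than under 2.2:

```python
midtone = 128 / 255.0  # a 50% gray in the digital file
for system_gamma in (1.8, 2.2):
    brightness = midtone ** system_gamma
    print(f"gamma {system_gamma}: {brightness:.0%} of full brightness")

# gamma 1.8: 29% of full brightness
# gamma 2.2: 22% of full brightness -- the same midtone appears darker
```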
Color Temperature
The other important setting when calibrating a monitor is the color temperature, sometimes called the white point because it affects the appearance of white on screen.
Several scientists in the late 1800s noted that cold, black objects radiate different colors of light as they are heated to high temperatures. This line of research led, among other things, to the development of the tungsten filament light bulb. In 1901, Max Planck published the law describing radiation from an ideal black body, a hypothetical object that reflects absolutely no light but radiates different wavelengths of light as its temperature increases. Although the ideal black body exists only in theory, Planck's law determines mathematically the wavelengths (i.e., colors) of light that would be emitted at different temperatures. At relatively low temperatures, the black body would glow red, then orange, then yellow. At very high temperatures it would radiate a bluish light.
This is quite different from our emotional associations with different colors. We think of blue as being cool, while yellow, orange and red are warm colors. But in the world of physics, it's just the opposite. You can confirm this by looking at a gas flame. The center of the flame – the hottest part – glows blue. The cooler outer edge of the flame glows yellow and orange.
Physicists express the temperature of the ideal black body in degrees Kelvin (°K). This is just a different scale for measuring temperature, like Celsius and Fahrenheit. The Kelvin scale is noteworthy because zero on the Kelvin scale is absolute zero, the temperature at which all molecular motion stops (equal to -459.67° on the Fahrenheit scale).
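Wien's displacement law, a standard result from this same era of physics, makes the connection between temperature and color concrete: the wavelength at which a black body radiates most strongly is inversely proportional to its temperature. A quick sketch:

```python
WIEN_B = 2.898e-3  # Wien's displacement constant, in meter-kelvins

def peak_wavelength_nm(temperature_k):
    """Wavelength (nm) at which a black body at this temperature emits most strongly."""
    return WIEN_B / temperature_k * 1e9

print(peak_wavelength_nm(3000))  # ~966 nm: peak in the infrared, visible glow reddish
print(peak_wavelength_nm(6500))  # ~446 nm: peak in the blue
```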
So what does this have to do with monitor calibration? There is no such thing as pure white. Every light source has a slight hue or color cast to it. For any given light source, we can match it to the temperature on the Kelvin scale at which a black body would emit the same color of light. Here are some common lighting conditions and their approximate color temperatures:

Candle flame – about 1,800°K
Tungsten (incandescent) light bulb – about 2,800°K
Sunrise or sunset – about 3,200°K
Midday sunlight – 5,000-5,500°K
Overcast sky – 6,500-7,000°K
Television screen* – about 9,300°K
Clear blue sky – 10,000°K or higher

*If you've ever driven by someone's house at night when their TV is on, you can see the blue glow in the windows.
Note that the color temperature is not a measure of the actual temperature of each object or condition. Clearly a television is not hotter than a candle flame. Instead, the color temperature is a measure of the color hue of white light under those conditions, corresponding to the color that would be emitted by a hypothetical black body at that temperature.
Any white objects that appear on your computer screen will have one of these color casts. You probably don't notice it because you are accustomed to thinking of a blank page on screen as being "pure" white. However, if you change the color temperature of your monitor, you will see a dramatic difference and the color cast will become obvious. On the Mac, open the Displays pane in System Preferences, select the Color tab, and click Calibrate. Here you have the option of changing both the gamma setting and the color temperature to see how they affect your screen. However, I recommend you DO NOT save any of these changes. You'll have a chance to choose gamma and color temperature later, when you run the calibration software that came with your colorimeter.
So which color temperature setting is best? As with the gamma setting, it depends on what kind of work you do. For many years, the standard color temperature setting for graphic arts work was 5000°K (also known as D50). This is closest to neutral white and simulates common lighting conditions for reading printed materials. Therefore, I feel this is the ideal color temperature to select if you do mostly work for print.
If you create mostly web graphics or other images viewed on screen, choose 6500°K (also known as D65). This is the default color temperature of the sRGB color space and is used by mass market computer monitors, most of which are uncalibrated. It also displays images with a bluish color cast that is familiar to consumers who watch lots of TV (e.g., most Americans).
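For the curious, the precise hue of "white" at each of these settings can be computed from the CIE daylight-locus formulas (the polynomial coefficients below are the published CIE values; this is an illustration, not calibration code):

```python
def daylight_chromaticity(t):
    """Approximate CIE 1931 chromaticity (x, y) of a D-series illuminant
    from its correlated color temperature t, valid for 4000-25000 K."""
    if 4000 <= t <= 7000:
        x = 0.244063 + 0.09911e3/t + 2.9678e6/t**2 - 4.6070e9/t**3
    elif 7000 < t <= 25000:
        x = 0.237040 + 0.24748e3/t + 1.9018e6/t**2 - 2.0064e9/t**3
    else:
        raise ValueError("temperature outside the defined range")
    y = -3.000*x**2 + 2.870*x - 0.275
    return round(x, 4), round(y, 4)

# Nominal correlated color temperatures: ~5003 K for D50, ~6504 K for D65.
print(daylight_chromaticity(5003))  # (0.3457, 0.3586) -- D50, a yellowish white
print(daylight_chromaticity(6504))  # (0.3128, 0.3292) -- D65, a bluer white
```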
Some experts argue that all computers should switch over to 6500°K (D65), even those used mostly for print work; this has been a recent trend in the graphic arts. They feel that a monitor calibrated to 5000°K (D50) is too dull and yellow, and that most users prefer to work at D65, which appears brighter and bluer, like a TV.
I disagree with this logic. It's true that the screen image will appear noticeably dull and yellow if you switch from D65 to D50. However, after working on the computer for a few minutes, you won't even notice it. If you work at D50 for a while and then switch back to D65, you'll be shocked at how blue and gaudy the screen appears. More importantly, the D50 standard does a better job of simulating what images look like when printed on paper under normal viewing conditions. This is why the D50 standard was adopted by the graphic arts industry in the first place. Switching all monitors to D65, even for print work, seems like a one-size-fits-all approach, pandering to the masses who work on cheap, uncalibrated systems. As someone who is 6'2" and 300 lbs., I chuckle at the notion of anything that claims to be "one-size-fits-all."
Bringing It All Together
In summary, the process of monitor calibration involves the following steps:
1. Locate your computer in the proper environment with moderate ambient lighting and no direct light sources shining on the monitor.
2. Turn the monitor on at least 30 minutes before calibrating.
3. Plug the colorimeter into the computer and hang it in front of the screen (follow the directions that come with the device).
4. Launch the calibration software that came with the device.
5. When prompted, select the values for gamma and color temperature. If you do mostly print work, I recommend gamma 1.8 and 5000°K (D50). If you create mostly web graphics, game assets, or other images viewed on screen, choose gamma 2.2 and 6500°K (D65).
6. Follow the instructions to have the colorimeter measure your monitor's colors and make any necessary adjustments.
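Under the hood, step 6 works something like the following sketch (the readings are hypothetical; real software talks to the colorimeter and corrects color as well as gamma): the software displays a ramp of gray swatches, reads the luminance of each, and fits the display's actual response curve so it can build a correction toward the gamma you chose in step 5:

```python
import math

# Gray swatch pixel values, and hypothetical normalized luminance
# readings (0.0-1.0) that a colorimeter might report for each one.
pixels   = [64, 96, 128, 160, 192, 224]
measured = [0.032, 0.087, 0.178, 0.312, 0.492, 0.723]

# For a power-law display, luminance = (pixel/255) ** gamma, so
# log(luminance) = gamma * log(pixel/255). Fit gamma by least squares
# through the origin in log-log space.
xs = [math.log(p / 255.0) for p in pixels]
ys = [math.log(m) for m in measured]
gamma = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
print(f"measured display gamma: {gamma:.2f}")  # ~2.5 for these sample readings
```

The software then loads a correction curve into the video card's lookup table so that the measured response matches your chosen target.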
In my previous post, I provided a list of monitor calibration devices currently on the market. Make sure you buy one that lets you select the gamma and color temperature settings. Some of the cheaper models limit you to a gamma of 2.2 and 6500°K (D65). The Datacolor Spyder3Express ($89) is one such model, as was the original Pantone Huey (no longer available). I can't recommend either model since you would not have the option of selecting 1.8 gamma or 5000°K (D50) color temperature.
The Datacolor Spyder3Elite and X-Rite i1Display Pro, both of which are over $200, provide the most control over color settings. They offer more gamma choices and the ability to choose a custom color temperature. While this flexibility is appealing, it can be dangerous if you don't know what you're doing. Only an expert in color management would need to choose a custom color temperature other than D50 or D65.
The Datacolor Spyder3Pro, X-Rite/Pantone ColorMunki Display, and the newer Pantone Huey Pro provide a choice of three or four gamma settings (including 1.8 and 2.2) as well as a choice of color temperature settings (including D50 and D65). These three models are the ideal choices for most digital artists and photographers. All three are in the $100-200 range, a small price to pay for accurate color on screen.
Jim Perkins is a Professor in the Medical Illustration program at Rochester Institute of Technology, where he teaches courses in human gross anatomy, scientific visualization, and computer graphics. He is also a practicing illustrator, creating artwork for several best-selling medical textbooks, mostly in the areas of pathology and physiology. For 20 years, he has been the sole illustrator of the Robbins and Cotran series of pathology texts. He is also part of a team of illustrators who carry on the work of the late Dr. Frank H. Netter, considered by many to be the greatest medical artist of the 20th Century. To see examples of Jim's work, visit the following links:
RIT faculty page
Netter art
The views expressed are those of the author(s) and are not necessarily those of Scientific American.
Source: https://blogs.scientificamerican.com/symbiartic/how-to-calibrate-your-monitor/