Your brain compensates for almost every color of light, delivering a believable impression of what you've come to think of as reality. White paper viewed under colored light still appears white because of what we call memory colors, a cognitive database built from repeated experience. If we associate a color with an object often enough, we establish a link between the two; memory colors are logged into our brains. These include grass (green), sky (blue), paper (white), oranges (orange), etc. Whenever you see these memory-color items, your brain registers the expected colors and, in a sense, overrides the actual color of the light. Whether under candlelight or sunlight, fluorescent or tungsten, sunset or noonday, a white sheet of paper will always appear white because your brain retains the associative reference. The human eye is very forgiving in this respect.

Unfortunately, this is not true for cameras (digital or film). A camera's sensors have no such recollection and are not so forgiving. A camera perceives white in a very assumptive manner, and trusting that its auto white balance (AWB) will correctly diagnose the light and set the proper color interpretation is a flawed and risky assumption fraught with problems.

First, in the language of RGB color, equal values of red, green, and blue light (like red 128, green 128, and blue 128) produce an absolutely neutral gray. For the camera's AWB algorithm to deliver accurate color, it must assume that a detectable and absolutely neutral gray component exists in the scene – a pretty wild assumption, considering that there are over 16,000,000 colors in the visible spectrum. The camera examines the light reflecting from objects in the scene and locks onto the cluster of pixels whose RGB values are closest to equal (regardless of how dissimilar they actually are). The AWB mandate then forces those colors to an absolutely neutral value while twisting all the other colors in the scene in a similar manner. This is all well and good IF that cluster of pixels in the captured scene actually is, in reality, neutral (gray) in color. The corrected values will then balance the colors in the image and produce an image that looks "real".

The issue: if the scene does not have an absolutely neutral component – if there is a bluish, somewhat-gray item in the scene that is not truly neutral gray (like the snow scene below) – then the image processor in your camera will dutifully and obediently change that bluish color to neutral gray, and shift all the other colors in the scene in the same direction on the color wheel. This is why you must balance color in Photoshop and Lightroom by referencing known neutral gray elements in the photo. The ColorChecker includes a row of neutral gray patches, none of them pure white.
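The AWB behavior described above can be sketched in a few lines of Python. This is a deliberately naive, hypothetical model (real camera firmware is far more sophisticated), but it shows the core gamble: pick the most-nearly-neutral pixel, assume it really is gray, and rescale every channel accordingly.

```python
import numpy as np

def naive_awb(image: np.ndarray) -> np.ndarray:
    """Toy model of auto white balance (simplified, for illustration).

    image: float RGB array of shape (H, W, 3), values in [0, 1].
    """
    flat = image.reshape(-1, 3)
    # Lock onto the pixel whose R, G, B values are closest to equal --
    # the camera assumes this pixel is a truly neutral gray.
    spread = flat.max(axis=1) - flat.min(axis=1)
    candidate = flat[np.argmin(spread)]
    # Force that pixel to neutral: per-channel gains that make its
    # R, G, B equal, applied identically to the whole image.
    gains = candidate.mean() / np.maximum(candidate, 1e-6)
    return np.clip(image * gains, 0.0, 1.0)
```

If the "most neutral" pixel is actually a bluish gray, these gains neutralize it anyway and drag every other color in the frame toward warm, which is exactly the failure mode described above.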
![Snow scene used as a white balance example](https://i0.wp.com/digital-photography-school.com/wp-content/uploads/2018/07/White-Balance-Alaska.jpg)
What we perceive as white in a photograph more often than not contains trace amounts of red, green, or blue – just enough to throw the color balance of the photo way off center if used as a reference (try it and you'll see). The Gray Balance tools in Photoshop and Lightroom will neutralize whatever color you click on, so always pick a gray patch rather than a white one.
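The click-to-neutralize behavior can be modeled as scaling each channel so the sampled pixel's R, G, B become equal. This is a simplified sketch, not the actual Photoshop or Lightroom math (those tools work in perceptual color spaces), but it shows why clicking a "white" with a trace of blue shifts the whole photo:

```python
import numpy as np

def neutralize_from_click(image: np.ndarray, y: int, x: int) -> np.ndarray:
    """Simplified gray-balance eyedropper: make the clicked pixel neutral.

    image: RGB array of shape (H, W, 3), values in 0..255.
    """
    sample = image[y, x].astype(float)
    # Gains that equalize the sampled pixel's channels...
    gains = sample.mean() / np.maximum(sample, 1e-6)
    # ...applied to every pixel in the image.
    return np.clip(image.astype(float) * gains, 0, 255)
```

Click a patch of RGB (250, 250, 255) – nearly white, with a trace of blue – and the tool dutifully pushes every pixel in the image toward yellow to cancel that trace.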
Photographic gray cards are absolutely color-neutral. Neutral gray colors (yes, gray is a color) are composed of equal, measurable parts of each RGB color, while pure white contains no measurable color at all. We don’t use white cards simply because you can’t measure data that doesn’t exist.
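A quick numeric illustration of why a gray card works and a white card doesn't (hypothetical patch values, same simplified gain model as above): a midtone gray card still records the cast of the light, while a white card clips every channel to the sensor's maximum and the cast information is gone.

```python
import numpy as np

def gains_from_patch(patch_rgb) -> np.ndarray:
    """Per-channel gains that would neutralize the sampled patch."""
    patch = np.asarray(patch_rgb, dtype=float)
    return patch.mean() / np.maximum(patch, 1e-6)

# Under warm light, a gray card records a measurable red-heavy cast,
# so it yields real corrective gains.
gray_gains = gains_from_patch([140, 128, 120])   # gains differ from 1

# A white card clips to maximum in every channel: equal values,
# gains of exactly 1, nothing left to correct.
white_gains = gains_from_patch([255, 255, 255])
```

You can't measure data that doesn't exist: once all three channels hit 255, the color of the light is unrecoverable.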
This is the bottom row of patches from the full ColorChecker chart, now published by X-Rite.