Colour Temp Setting


Richard Flack

New Member
Joined: Mar 21, 2016
Messages: 19
Lightroom Experience: Beginner
Lightroom Version:
Operating System: Windows 10
Beginner question, I'm afraid. Hitherto I've used the Temp slider to tweak the colour temperature of an image and not really paid much attention to the numeric value.
But today I've noticed something puzzling. I took the same picture (a dinner party) twice, once with flash and once without. Both images looked "ok" out of the camera, but the flash image was a little too cold and the non-flash (halogen ceiling flood lights) image a little too warm.
What really surprised me though is that the flash image had a colour temp of 5500 (which I tweaked to 6000) and the natural light image had a temp of 3100 which I tweaked to 2900.

I realise that I don't understand the technicalities, but it seems odd to me that the two images need settings so far apart. If, for example, I change the flash shot to a Temp of 2900 it goes totally blue, and if I change the non-flash shot to 6000 it goes totally yellow/orange.

Can someone help explain this, or point me to a tutorial?

Both images were taken with a Sony A6000 as RAW files, "Auto" settings, kit lens at f/3.5.
Flash: ISO 800, 1/60th
No flash: ISO 3200, 1/10th
 
Flash light has much the same color as outdoor daylight. Indoor light is much warmer. So a flash image and the same (indoor) shot without flash need very different settings when it comes to the correct white balance.
 
Thank you, I understand what you say ... but ... I'm still confused!

See images at link below.
01 & 02 are more or less as taken, with & without flash.
01 (Flash) has Temp 5750; 02 (NoFlash) has Temp 3150. The flash shot looks cooler than the no-flash shot, but has the higher Temp setting.
This seems backwards to me

03 is 01 (Flash) with the Temp set to 3150 (the NoFlash value), and 04 is the converse. Both look totally distorted.

https://gyazo.com/502b3a8e967bb1c7452cd36b5657386a
 
This seems backwards to me
It is backward. Here's why:

"Cool" and "warm" colours are a psychological thing. Red is described as "warm"; blue is described as "cool". The "colour temperature" is based on what colour light is emitted as you heat something up. At relatively low temperature, you get light that peaks in the red end of the spectrum. At higher temperature, the thing you're heating emits mainly blue light. So increasing the colour temperature in LR moves the image toward blue.
 
Wow. I sort of knew that way in the back of my mind but never connected it to this issue. So that explains that bit (i.e. what happens to either image when I change the Temp). Thanks for that.

I'm still not clear on why there is such a large numeric difference in Temp, as recorded, between the flash and no-flash shots, when visually they are not all that far apart (compared to the drastic effect of 'swapping' the Temp settings).
 
So increasing the colour temperature in LR moves the image toward blue.
Except that it doesn't. Increasing the colour temperature slider in LR moves the image toward yellow, not blue - the slider is even colour-coded to indicate this. This is where the confusion comes from, I think.

Try thinking of it this way: by adjusting the colour temperature slider you are telling Lightroom "this is the colour of the light that was present when I took this photo". The colour of the light is reflected from all of the objects in your photo and will therefore give everything a "colour cast". When you set the colour temperature slider to something high (which means blue-ish), LR says "OK, a blue light was used for this photo, so to make things look natural I need to compensate by adding more yellow". So increasing the colour temperature slider makes the image more yellow. Conversely, if you set the colour temperature slider to something low (which means yellow-ish, such as tungsten room lighting), LR thinks "a yellow light was used here, so to compensate I need to make the image more blue".

So the colour key below the colour slider in LR indicates how adjustments will affect your photo, and NOT the colour of the available light that was present when the photo was taken.
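If it helps to see the direction of the compensation, here is a toy Python sketch. This is not Adobe's actual model, and the three channel wavelengths are just assumptions, but it estimates an illuminant's colour from Planck's law and derives the inverse gains a converter would apply:

```python
import math

H, C, K = 6.626e-34, 2.998e8, 1.381e-23               # physical constants
WAVELENGTHS = {"R": 610e-9, "G": 550e-9, "B": 465e-9}  # assumed channel centres

def planck(wl: float, temp: float) -> float:
    """Relative spectral radiance of a blackbody at wavelength wl (m)."""
    return (2 * H * C**2 / wl**5) / math.expm1(H * C / (wl * K * temp))

def wb_gains(slider_temp: float) -> dict:
    """Channel gains that neutralise a blackbody at slider_temp,
    normalised so green stays at 1.0 (a von Kries-style correction)."""
    energy = {ch: planck(wl, slider_temp) for ch, wl in WAVELENGTHS.items()}
    return {ch: energy["G"] / e for ch, e in energy.items()}

for t in (3100, 5500, 8000):
    g = wb_gains(t)
    print(f"{t} K: R x{g['R']:.2f}  G x{g['G']:.2f}  B x{g['B']:.2f}")

# As the slider temperature rises, the red gain rises and the blue gain
# falls: the software assumes bluer light and compensates toward yellow.
```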
 
Think about color temperature this way: "red hot" is cooler than "white hot". The temperature of the Sun is 5,778 K; we see this as natural light color (white light). A tungsten filament lamp runs at about 2800–3300 K; we see this as the warm orange glow of an incandescent light. The color temperature of white sodium lamps, at 2600 K to 2800 K, closely resembles incandescent lighting. These color temperatures correspond to the settings in Lightroom.
 
Wow. I sort of knew that way in the back of my mind but never connected it to this issue. So that explains that bit (i.e. what happens to either image when I change the Temp). Thanks for that.

I'm still not clear on why there is such a large numeric difference in Temp, as recorded, between the flash and no-flash shots, when visually they are not all that far apart (compared to the drastic effect of 'swapping' the Temp settings).
Our built-in auto-white-balance capability is truly astonishing. While this doesn't exactly answer your question, it is useful to understand the incredible range of lighting conditions that we automatically correct for, allowing us to perceive the world around us with amazingly consistent colours.

As an example, I was on a building site where the roof was being replaced. I was standing in a space lit by summer daylight coming through the new translucent green waterproof layer that goes under the new tiles (shingles). This was so extreme that I was aware of a slight colour imbalance (other people had a very slightly sickly green hue), but it wasn't that bad at all. However, there were small gaps in the walls which allowed me to see daylight, and this now looked rose pink instead of the white of normal cloud. Our built-in auto white balance happens so fast and so effectively that we are rarely aware of it. Cameras struggle to match what we can do and so regularly get things a little (or a lot) wrong.

Some cameras allow the use of a product like an ExpoDisc. This will set the technically correct colour balance for the scene you are in, though I have found that what is correct can still look odd. Using a colour reference card is the best way to get colours correct.
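For the colour reference card approach, the arithmetic behind it is simple: sample a patch you know is neutral and scale the channels until it really is neutral. A minimal NumPy sketch; the patch coordinates and the synthetic test image are made up for illustration:

```python
import numpy as np

def gains_from_grey_patch(image, y0, y1, x0, x1):
    """image: HxWx3 linear-RGB float array; (y0:y1, x0:x1) covers the card.
    Returns per-channel multipliers that make the patch neutral."""
    patch_means = image[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)
    return patch_means[1] / patch_means  # scale R and B to match green

def apply_wb(image, gains):
    return np.clip(image * gains, 0.0, 1.0)

# Synthetic example: a grey-ish image given a warm (too-red) cast.
rng = np.random.default_rng(0)
img = np.clip(rng.normal(0.5, 0.1, (100, 100, 3)), 0, 1) * [1.3, 1.0, 0.7]
img = np.clip(img, 0, 1)

gains = gains_from_grey_patch(img, 40, 60, 40, 60)
balanced = apply_wb(img, gains)
print("gains:", np.round(gains, 2))  # roughly [0.77, 1.0, 1.43]
```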
 
If a monitor offers an sRGB mode, setting it to this mode should present no problems.
Most newer monitors have a gamut much larger than sRGB. Apple monitors use DCI-P3 out of the box, and my BenQ can also be tuned to DCI-P3 or AdobeRGB. sRGB is a very old standard, developed by HP and Microsoft when CRT displays were the only option. Many apps and browsers are not color managed, so forcing all the pixels into the smallest color envelope provides an acceptable, uniform viewing experience across a wide variety of monitor displays and viewing apps.
 
A color temperature of 6500 K is standard for ordinary PC use and for the sRGB standard. Most LCD monitors offer a setting of 6500 K among their color temperature options. If a monitor offers an sRGB mode, setting it to this mode should present no problems.
The color temperature of the monitor has nothing to do with the color temperature of the white balance of individual images. This thread is about the latter.
 
Ok. Those are Kelvin values - colour temperatures. You're shooting RAW images, so you have access to fine adjustments.
Tungsten light is warm and flash light is cool.
Generally, you'll make WB changes in your camera whilst shooting - and yes, as noted above, you can choose Auto WB and get mostly good results if you're shooting in changing conditions.
 

Attachments: images.png (8.4 KB)
Ok. Those are Kelvin values - colour temperatures. You're shooting RAW images
Generally, you'll make WB changes in your camera whilst shooting - and yes, as noted above, you can choose Auto WB and get mostly good results if you're shooting in changing conditions.
A RAW file in the camera does not have white balance, colour temperature, tone, or noise reduction applied. This is one of the reasons they are called RAW. The RAW file simply contains photosite values from the sensor's colour filter array. These have to be converted to RGB pixels, either by the camera's processor, which uses the in-camera settings and outputs a JPEG file, or by being demosaiced into an RGB image in Lightroom, using settings applied automatically on import or adjusted in the Develop module.

The best in-camera practice for shooting RAW is to pay attention to focus and exposure. Set the WB to Auto, knowing that you can change it properly in Lightroom's Develop module, and the JPEG preview on the camera back and the initial thumbnail will still look reasonable on first review.
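You can see this split between data-in-the-file and settings-applied-at-conversion with the open-source rawpy library. In this sketch (the filename and the multiplier values are made up) the same RAW is rendered twice with different white balances, and the file itself never changes:

```python
import rawpy
import imageio.v3 as iio

# Render honouring the camera's as-shot (e.g. Auto) white balance.
with rawpy.imread("DSC00123.ARW") as raw:
    print("camera-suggested multipliers:", raw.camera_whitebalance)
    as_shot = raw.postprocess(use_camera_wb=True)

# Re-open and render the same sensor data with different multipliers:
# only the conversion settings changed, exactly as in Lightroom's
# Develop module.
with rawpy.imread("DSC00123.ARW") as raw:
    custom = raw.postprocess(user_wb=[2.0, 1.0, 1.5, 1.0])  # [R, G1, B, G2]

iio.imwrite("as_shot.png", as_shot)
iio.imwrite("custom_wb.png", custom)
```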


 
Yes, they do. It's possible I don't understand what you mean here, but the file format doesn't remove access to in-camera settings. Some cameras let the user make changes to tone and to NR. I use a 6DII and a 7DII, and both have advanced options for those settings.
 