Different representations of colours between software

Hi there!

I hope someone can help me out, because I'm a bit puzzled. I have two colour-calibrated PCs (Datacolor Spyder), and recently I've noticed that photos are rendered with different saturation/contrast levels in different software.

Lightroom, Photoshop, and Chrome seem to display photos virtually identically. But Edge, the default Windows Photos app, and FastStone all display the photos a bit more saturated and contrasty.

This happens on two machines, both Windows 10, both calibrated, and even with the same type of monitor. And I do export photos with the sRGB profile. Those JPGs are displayed in Photoshop almost exactly like the raws in Lightroom, but differently in the other apps, apart from Chrome...

I'm quite lost with this and don't know what to trust anymore.
Thanks in advance for any suggestions!

Dennis
 
Are they calibrated to sRGB, or to a wide gamut? I found that different software was very unpredictable when I had my wide-gamut monitor calibrated to be wide, and things work much more consistently in sRGB. That's clearly not a good answer for many people, but since I work mostly for web use (or newsprint, which is an even narrower gamut) it works for me.

There should be a way to make (almost) everything work consistently, but I simply gave up after experimenting, even with supposedly color managed applications.
 
Ah, so you've also had your fair share of racking your brains (and eyes), by the sound of it.
Thanks for thinking along with me, Ferguson. They're calibrated to (99%) sRGB; the monitors are not wide gamut...
 
I don't know about Edge, but as far as I know Windows Photos and FastStone are not color managed. That means that it doesn't matter how your display is calibrated, because these programs do not use the display profile to correct the colors; they simply send the RGB values to the display 'as is'. And that is exactly why you see that difference: your display may be close to sRGB, but it is probably not exactly sRGB.
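
To make that concrete, here's a rough numeric sketch (Python with numpy; the display response is made up) of why "close to sRGB but not exactly sRGB" shows up as a tone/saturation shift when an application sends values 'as is':

    import numpy as np

    def srgb_to_linear(v):
        # The official sRGB decoding curve (IEC 61966-2-1):
        # what the image values are supposed to mean.
        v = np.asarray(v, dtype=float)
        return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

    def native_to_linear(v):
        # A hypothetical display whose native response is a plain
        # 2.2 power curve: close to sRGB, but not identical.
        return np.asarray(v, dtype=float) ** 2.2

    v = 0x50 / 255.0  # the same 8-bit midtone sent by a non-managed app
    print(srgb_to_linear(v))    # ~0.080: intended light output
    print(native_to_linear(v))  # ~0.078: what this display actually emits

The per-channel mismatch is small, but it is exactly the kind of difference you are seeing between managed and non-managed applications.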
 
From what I can gather, Edge isn't colour-managed either.
 
I had a feeling this was the case.
The tricky thing is that my clients may view my work in non-colour-managed software... So during post-processing I end up constantly exporting and checking the result in different software, going back and forth so that the result won't look too saturated in non-colour-managed software.
I wonder how others deal with this... It takes some of the confidence out of editing.
 
I don't know about Edge, but as far as I know Windows Photos and FastStone are not color managed. That means that it doesn't matter how your display is calibrated, because these programs do not use the display profile to correct the colors; they simply send the RGB values to the display 'as is'. And that is exactly why you see that difference: your display may be close to sRGB, but it is probably not exactly sRGB.
Now wait... am I confused?

I thought color management had to do with displaying images in different color spaces. Thus if I had an AdobeRGB and an sRGB version of the same image and displayed them in Firefox (which is color managed) and in Photos, they would display differently; but if I displayed them in Firefox and in Photoshop (both of which are color managed), they should look the same.

I thought that the calibrated profile, on the other hand, was to ensure that the same color output request to the driver (color managed or not) appeared approximately the same way on two monitors (assuming you used all the same targets for calibration).

More specifically, I did not think that a color managed application itself accessed the profiles created for a display, but rather the display driver was doing so? (I realize you can select a monitor profile for soft proofing while on that monitor, but there's a lot you can select that is not meaningful)

And more importantly to the current discussion, if the displays are calibrated for sRGB and the images are all sRGB, should it matter if the applications are color managed?
 
Color management is used to convert colors between color spaces, so indeed between sRGB and AdobeRGB, for example. However, your monitor also has a color space, so color managed applications will convert the colors from the color space of the image (sRGB, AdobeRGB, ProPhotoRGB) to the color space of the monitor (defined in the monitor profile). Non-color managed applications don't do this.

Displays aren't 'calibrated for sRGB'. They are calibrated for their native color space, and a profile is made to describe that native color space. This may be close to sRGB for cheaper monitors, but it is unlikely that it is exactly sRGB.
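
If you want to see that conversion step in isolation, here is a minimal sketch of what a color managed application does before the pixels reach the driver (Python with Pillow; "photo.jpg" and "my_monitor.icm" are placeholders for your own export and for whatever profile your calibrator installed):

    from PIL import Image, ImageCms

    img = Image.open("photo.jpg")  # hypothetical sRGB JPEG export

    # Source: the color space the image values are defined in.
    srgb = ImageCms.createProfile("sRGB")
    # Destination: the profile the calibration software made for this display.
    monitor = ImageCms.getOpenProfile("my_monitor.icm")

    # This conversion is the step color managed apps perform
    # and non-managed apps skip entirely.
    managed = ImageCms.profileToProfile(img, srgb, monitor, outputMode="RGB")
    managed.show()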
 
@Ferguson, as far as I remember from when I really got into this stuff, I also understood the display profile to be a sort of final, universal "translation" of colours to the display, so applications don't "use" the display profile one-to-one. As I've read in the past, whether you're working in ProPhoto (Lightroom) or AdobeRGB (within Photoshop, for instance), on a calibrated system it still gets translated to a "universal" rendering of colours that falls within (a certain percentage of) sRGB if your monitor isn't wide gamut. So I'm still confused about what's happening here, since I'd like to post-process in Lightroom and Photoshop in such a way that the exported JPG (with sRGB) is displayed similarly in Edge or the Photos app, at least on my machine. That way I can trust there's a higher chance my photos are displayed reasonably similarly on other people's systems.
I feel that these apps (Photos, FastStone, Edge) simply ignore the sRGB colour profile that's attached to the exported JPG? Because when I open the JPG back in Photoshop, it's displayed exactly like the raw file (still in ProPhoto) in Lightroom...
 
You are both confusing two things: the color profile of the image and the color profile of the monitor. A fully color managed application will use both. It will use the embedded color profile of the image to understand what the colors of the image are, and it will use the monitor profile to convert the colors from the image to the color space of the monitor.

A non-color managed application will usually ignore both. So if you open a ProPhotoRGB image in such an application it will display very muted colors because it doesn't recognize the super wide gamut color space of the image. And if you use it on a wide gamut monitor it will oversaturate sRGB images because it doesn't use the monitor profile to convert the sRGB color values to the color values of the wide gamut color space of the monitor.

Because most consumer monitors are close to sRGB, using sRGB is always your best bet in this case, but it is also as far as you can go. A non-color managed application running on a computer with a consumer grade monitor will do nothing with the color values of the image. But because the image is sRGB and the monitor is at least close to sRGB, that will give a reasonable match. You can't do more than that and you can't expect more than that.
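
As a hedged sketch (Pillow again, file names made up), the fully color managed path reads the embedded profile instead of assuming one:

    import io
    from PIL import Image, ImageCms

    img = Image.open("export.jpg")
    icc_bytes = img.info.get("icc_profile")  # the embedded profile, if any

    if icc_bytes:
        # Use the embedded profile (sRGB, AdobeRGB, ProPhotoRGB, ...)
        src = ImageCms.ImageCmsProfile(io.BytesIO(icc_bytes))
    else:
        src = ImageCms.createProfile("sRGB")  # a common fallback assumption

    monitor = ImageCms.getOpenProfile("my_monitor.icm")  # placeholder path
    corrected = ImageCms.profileToProfile(img, src, monitor, outputMode="RGB")
    # A non-managed viewer skips both steps, which is why a ProPhotoRGB
    # image looks muted and sRGB looks oversaturated on a wide gamut panel.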
 
(This was written before Johan's last posting above, sorry)

Displays aren't 'calibrated for sRGB'. They are calibrated for their native color space, and a profile is made to describe that native color space. This may be close to sRGB for cheaper monitors, but it is unlikely that it is exactly sRGB.

Depends on the display. On one NEC I have, you explicitly pick a gamut to calibrate the display to (native is also a choice).

However, your monitor also has a color space, so color managed applications will convert the colors from the color space of the image (sRGB, AdobeRGB, ProPhotoRGB) to the color space of the monitor (defined in the monitor profile). Non-color managed applications don't do this.

So here is where I get confused. There are profiles, and there are LUTs. I have hardware LUTs in my monitor, but as I understood it, the LUTs basically handle the correction to get the monitor into a specific color space (or as close as it can get).

So let's say I calibrate my monitor to sRGB (it's a specific calibration selection). My understanding is that the LUTs are then set up so that if a non-color-managed application sends 0x50, 0x50, 0x50, the LUTs translate it as needed to produce "grey" in the selected/native gamut. So the actual LCD settings may be 0x48, 0x52, 0x50, but the resulting dot pattern is approximately neutral grey.

So if a color managed application sent an AdobeRGB-equivalent neutral grey (let's say that is 0x30, 0x30, 0x30; I'm making numbers up), then the color managed application uses the ICC profile to translate this to the monitor's profile and sends 0x50, 0x50, 0x50, and the LUTs again change that to 0x48, 0x52, 0x50.

Is that correct? And while I have hardware LUTs, I think this works the same if the LUTs are loaded in the video card.

So in my case with an sRGB monitor (and I mean one calibrated to sRGB, not to "native"), I would expect its ICC profile to be sRGB, i.e. sending sRGB through it is unity and does nothing (the LUTs are doing the changes).

Am I still on track?

So to the OP's question: if sending an sRGB image to two (mostly) sRGB monitors results in different appearances in non-color-managed apps, but not in color managed applications, that would imply the monitors (one or both) are substantially NOT calibrated to sRGB, and the lack of use of the monitor profile for that pre-LUT translation is the issue.

Which is what you said, of course. I'm just trying to pin down why: it's due to the monitors not really being sRGB (and in this case, not even being the same not-quite-sRGB)?
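
To pin down my own mental model, here's a toy version of that LUT stage (Python with numpy; all numbers made up to mirror the 0x50 example above):

    import numpy as np

    # The calibration LUT, loaded into the video card (or the monitor),
    # applies to everything the system displays, managed or not.
    levels = np.arange(256)
    lut_r = np.clip(levels - 8, 0, 255)  # e.g. 0x50 -> 0x48
    lut_g = np.clip(levels + 2, 0, 255)  # e.g. 0x50 -> 0x52
    lut_b = levels.copy()                # e.g. 0x50 -> 0x50

    r, g, b = 0x50, 0x50, 0x50  # what any application sends for "grey"
    panel = (lut_r[r], lut_g[g], lut_b[b])
    print([hex(int(c)) for c in panel])  # ['0x48', '0x52', '0x50']

    # A color managed app would additionally convert image-profile values
    # to monitor-profile values *before* this stage; the LUT itself only
    # straightens the panel's response.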
 
After calibrating, the Datacolor software indicated the monitors could render 99.9% of sRGB, and these two monitors are specifically advertised as such... When calibrating, I used the advanced option, which involved using the monitor's RGB sliders to get it as close to 6500 K as possible, and then, for the actual calibration process that makes the colour profile, setting the "White point" to native (since I could get the monitor very close to 6500 K with the sliders).

But why would the Windows Photos app and Edge ignore sRGB? Strange... It's the international standard.
 
Your NEC is probably a wide gamut monitor. Some wide gamut monitors can indeed be calibrated for a specific standard color space, such as sRGB. We are talking about consumer monitors however. They do not have this option.

What confuses you is that you don't realize that what is usually called 'calibrating a monitor' is actually two steps: calibration and profiling. Calibration is used to make sure that the monitor is displaying its colors in the best possible way, and that grey is really neutral grey. That is where the LUT comes into play and that is used by the driver. After you've calibrated your monitor, a profile is made. That profile describes your monitor color space, just like the AdobeRGB profile describes the AdobeRGB color space. And that profile is used by color managed applications to translate the colors between the color space of the image and the color space of the monitor. That is what we call 'color management'.
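
If you're curious which profile color managed applications are actually being handed, you can ask the OS; a quick sketch with Pillow (works on Windows; returns None elsewhere):

    from PIL import ImageCms

    # The profile the calibration software registered for the current
    # display - what Lightroom/Photoshop/Chrome consult and what
    # Photos/FastStone/Edge ignore.
    profile = ImageCms.get_display_profile()
    if profile is not None:
        print(ImageCms.getProfileDescription(profile))
    else:
        print("No display profile available")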
 
But why would the Windows Photos app and Edge ignore sRGB? Strange... It's the international standard.

They don't 'ignore sRGB', they ignore any color profile.
 
After calibrating, the Datacolor software indicated the monitors could render 99.9% of sRGB

I don't know this specific software, but that probably means that your monitor can display 99.9% of the sRGB color space. It does not mean that the color space of your monitor is 99.9% identical to sRGB.
 
In fact, you could say that they assume that everything is sRGB!

But that's my point with regard to the OP's question. If you have a monitor that is (approximately) sRGB and an image that is sRGB, and you display it in a color managed application and in a non-color-managed application on that monitor, are any differences the result of the difference between the monitor's actual gamut and sRGB?
 
But that's my point with regard to the OP's question. If you have a monitor that is (approximately) sRGB and an image that is sRGB, and you display it in a color managed application and in a non-color-managed application on that monitor, are any differences the result of the difference between the monitor's actual gamut and sRGB?

Correct. If your monitor was exactly sRGB, you should see no difference.
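
You could even put a rough number on how far off your panel is; a sketch with Pillow, where "my_monitor.icm" is again a placeholder for your actual monitor profile:

    from PIL import Image, ImageCms

    srgb = ImageCms.createProfile("sRGB")
    monitor = ImageCms.getOpenProfile("my_monitor.icm")  # placeholder path

    # Push a saturated test swatch through the sRGB -> monitor conversion,
    # i.e. the correction a color managed app would apply.
    swatch = Image.new("RGB", (1, 1), (200, 60, 60))
    converted = ImageCms.profileToProfile(swatch, srgb, monitor, outputMode="RGB")

    print(swatch.getpixel((0, 0)), converted.getpixel((0, 0)))
    # If the two triplets barely differ, the panel is effectively sRGB and
    # managed vs non-managed viewers should look nearly identical.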
 
Could someone perhaps check, as a reference for me, whether there's a (significant) increase in saturation in the default Photos app for exported JPGs (with sRGB), compared to Lightroom?
I still somehow feel that something is more "off" than it was in the past...
Thanks :)
 
I'm not sure what you mean by "default photos app". I shoot raw, so do you mean Nikon's (in my case) processing software?

Different raw conversion software behaves differently of course, in terms of all sorts of things - saturation, actual colors, etc. Lightroom by default with the default profiles is pretty flat and low contrast.
 
Photos on the left, LR on the right. Although the snipping tool doesn't accurately show the colours, you should be able to see the increased saturation in the Photos app.

 
Photos on the left, LR on the right. Although the snipping tool doesn't accurately show the colours, you should be able to see the increased saturation in the Photos app.

So let me guess: you use a wide gamut monitor.
 
Photos on the left, LR on the right. Although the snipping tool doesn't accurately show the colours, you should be able to see the increased saturation in the Photos app.

Thanks so much! Phew, so at least I know it's a common thing. Is the "Enhance photos" option enabled in your Photos app? Though it still oversaturates even when the option is off.

Hm, this is actually quite annoying, that other software doesn't adhere to a standard the way Lightroom and Photoshop do...
 
I'm not sure what you mean by "default photos app". I shoot raw, so do you mean Nikon's (in my case) processing software?

Different raw conversion software behaves differently of course, in terms of all sorts of things - saturation, actual colors, etc. Lightroom by default with the default profiles is pretty flat and low contrast.

I meant something like what Jim Wilde did. But once again, thanks for your willingness to help, guys! :)
 