What is 100%!? (& Apple monitor advice)

I have a new MacBook Pro 14" and it's really sharp. The screen is 254 ppi so I guess that's why.

I've been down a few rabbit holes researching Apple screens and am now questioning if I'm always viewing images at 100% when I think I am or if the screen resolution messes things up.

For example I have an old (23" I think) display which I know runs at around 98 ppi. With that connected to my MacBook Pro, will Lightroom 'know' and display the correct 100% pixel-for-pixel view? I seem to remember a discussion suggesting some Apple monitor combinations might need LrC set to 200% to view at 100%. So I'm slightly confused!

My old 98 ppi screen looks quite rough so I'd like something better. For a 27" screen Apple's Retina is around 218 ppi. For those who have an Apple Studio Display: is it as sharp and detailed as I'm expecting? Or should I get an EIZO for the same price or less (but with better colour!) and run it at around 110 ppi?

Thanks
 
Lightroom will know. There is an issue with high resolution screens if you set the screen resolution (in MacOS System Settings) lower than the native resolution (because otherwise menus and other interface items will be tiny). That does not apply to this old screen, I assume.
 
For those who have an Apple Studio Display: is it as sharp and detailed as I'm expecting?
Impossible to say, as I don't know what you would be expecting. All I can say is that it is every bit as sharp and detailed as I was expecting. In all probability, it will be a touch less sharp than the MBP display (218 ppi versus 254 ppi), but I have the two displays running side by side and I find it difficult to see a difference. I usually keep the MBP lid closed, as I much prefer the size of the Studio display.
 
For example I have an old (23" I think) display which I know runs at around 98 ppi. With that connected to my MacBook Pro, will Lightroom 'know' and display the correct 100% pixel-for-pixel view? I seem to remember a discussion suggesting some Apple monitor combinations might need LrC set to 200% to view at 100%.

This all boils down to whether macOS renders to a display using “Retina” resolution or not. Also, what 100% means in photo apps hasn’t changed, but as display resolutions have increased and the size of one pixel has decreased dramatically, 100% has become a less useful/reliable way to measure “actual size.”

If the pixel dimensions of a display are high enough, macOS decides that it needs to default to “Retina” mode (called “HiDPI” in Windows) to keep everything readable. What Apple calls a “Retina” display is when it starts doubling the pixel density. So, if you connect a 4K display (3840 x 2160 px), macOS might default to 1920 x 1080 px (Retina), which means the UI is as large and readable as it is at 1920 x 1080 px, but UI elements are double the resolution (2x), so for example, text looks sharper. This also means that 100% on a Retina/HiDPI display generally looks about half as large (along one dimension) or 1/4 as large (in area) as it does on an older 1x display. This makes sense because the pixels are half as large as they are on older 1x displays. It is on these Retina/HiDPI displays where, in image editing apps (not just from Adobe), 200% shows an image as large as 100% on an older 1x display.

In macOS System Settings > Displays, the “Default” pixel dimensions are typically exactly half those of the panel, for exact 2x scaling, which is fast. You can select another resolution (I pick a larger one to fit more Lightroom Classic controls on screen), which macOS warns might be a little slower because of the non-exact scaling, but I’ve never noticed a difference.
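
To put numbers on how much smaller 100% looks as pixel density goes up, here's a rough sketch (the 3000 px image width is just an arbitrary example):

```python
# Physical width of a 3000 px wide image at 100% (1 image pixel = 1 panel pixel),
# which depends only on the panel's pixel density.
image_width_px = 3000  # arbitrary example image

for name, ppi in [('old 23" display', 98),
                  ('Studio Display', 218),
                  ('MacBook Pro 14"', 254)]:
    width_in = image_width_px / ppi  # inches on screen at 100%
    print(f"{name}: {width_in:.1f} in wide at 100%")
# 98 ppi  -> ~30.6 in, 218 ppi -> ~13.8 in, 254 ppi -> ~11.8 in
```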
 
It can be a bit more complicated, because macOS usually defaults to something that is more than 50% these days, so perhaps 2560 pixels wide on a 4K display (because the physical size of 4K displays is larger these days). macOS will then tell apps to use double the 2560 pixels (so 5120 pixels), so it can downscale that to the 3840 pixels of the display. That means a crisp view. Unfortunately, apps like Lightroom Classic and Photoshop will also use this for images, not just for interface elements. So “100% image display” will mean the app places 5120 pixels on the screen, but because that is downscaled to 3840 real pixels, you are not really viewing the image at 100% (one image pixel = one monitor pixel), but at 75%.
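
To make that arithmetic concrete, here is a tiny sketch using the same numbers (4K panel, "looks like 2560 wide" scaled resolution):

```python
# Effective zoom at "100%" when macOS renders at 2x a non-native scaled resolution.
panel_width = 3840                    # physical pixels across a 4K panel
looks_like_width = 2560               # scaled resolution chosen in System Settings
backing_width = looks_like_width * 2  # macOS renders apps at 2x, then downscales

# At "100%" the app lays down one image pixel per backing-store pixel,
# which the OS then squeezes into panel_width real pixels.
effective_zoom = panel_width / backing_width
print(f"'100%' in the app is really {effective_zoom:.0%} on the panel")  # -> 75%
```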
 
Impossible to say, as I don't know what you would be expecting. All I can say is that it is every bit as sharp and detailed as I was expecting.
That's the answer I was hoping for! As I understand it, the difference in ppi between the MacBook Pro and Studio Display is because you'll be viewing the display at a greater distance. I believe iPhones have an even greater ppi for this reason.
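
A rough way to see the viewing-distance point is to compare pixels per degree of visual angle rather than raw ppi. A quick sketch (the viewing distances and the iPhone's 460 ppi are just my assumptions, not measurements):

```python
import math

def pixels_per_degree(ppi, viewing_distance_in):
    # pixels covered by one degree of visual angle at the given distance (inches)
    return ppi * 2 * viewing_distance_in * math.tan(math.radians(0.5))

# Assumed (not measured) viewing distances:
print(round(pixels_per_degree(254, 20)))  # MacBook Pro 14" at ~20 in -> ~89
print(round(pixels_per_degree(218, 26)))  # Studio Display at ~26 in  -> ~99
print(round(pixels_per_degree(460, 12)))  # recent iPhone at ~12 in   -> ~96
```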

Supplementary question @Jim Wilde: Standard glossy or fancy matt? I was looking towards the matt version as that's what I'm used to, but if the standard Display is the same as the MBP that may work for me. I'm currently using the MBP in my usual office under my editing conditions and don't see any reflections on the screen. There's also the internet FUD that the matt version loses some sharpness, and sharpness is the reason to look at the Apple Display in the first place.

This all boils down to whether macOS renders to a display using “Retina” resolution or not. Also, what 100% means in photo apps hasn’t changed, but as display resolutions have increased and the size of one pixel has decreased dramatically, 100% has become a less useful/reliable way to measure “actual size.”
Yes, the whole thing about the way Apple scales is confusing. It would appear it's not as good as Windows! So I'm keeping away from 4K displays as I know they'll end up at 1920, and I feel 2560 is about the sweet spot for a 27" screen. The Apple Studio will give me that at 218 ppi; otherwise a 2K display such as an EIZO 27" will scale 'correctly' to 110 ppi or thereabouts.
 
It can be a bit more complicated, because macOS usually defaults to something that is more than 50% these days, so perhaps 2560 pixels wide on a 4K display (because the physical size of 4K displays is larger these days). macOS will then tell apps to use double the 2560 pixels (so 5120 pixels), so it can downscale that to the 3840 pixels of the display. That means a crisp view. Unfortunately, apps like Lightroom Classic and Photoshop will also use this for images, not just for interface elements. So “100% image display” will mean the app places 5120 pixels on the screen, but because that is downscaled to 3840 real pixels, you are not really viewing the image at 100% (one image pixel = one monitor pixel), but at 75%.
Ah, interesting, thanks. So that's another reason for me to stay clear of 4K.
 
Supplementary question @Jim Wilde: Standard glossy or fancy matt? I was looking towards the matt version as that's what I'm used to, but if the standard Display is the same as the MBP that may work for me. I'm currently using the MBP in my usual office under my editing conditions and don't see any reflections on the screen. There's also the internet FUD that the matt version loses some sharpness, and sharpness is the reason to look at the Apple Display in the first place.
I went with the standard display, though I don't tend to think of it as "glossy". I found an Apple Store that had both screen types next to each other, so was able to compare them at the same time. I didn't think that the anti-reflective Nano-Texture display was very much better than the anti-reflective Standard display, certainly not enough in my mind to justify the extra cost, especially when considering the more finicky cleaning that the Nano-Texture screen requires. It's not a decision that I regret, and certainly I see no difference between the MBP display and the Studio display.
 
I see no difference between the MBP display and the Studio display.
Thanks again @Jim Wilde, very useful. I did once buy a MacBook Pro with a glossy screen and it really didn't work for me. I swopped it for a matt version. Back in the day when there was an option! But my last couple have been standard and clearly they're different.
 
Ah, interesting, thanks. So that's another reason for me to stay clear of 4K.

That scaling issue is actually a huge reason why Apple has favored 5K over 4K on their 27" desktop displays. 5K resolution divided by 2 results in the same UI resolution as the 2560 x 1440 px resolution @1x that so many people find comfortable on 27" displays, so in that way 5K can be a more practical Retina/HiDPI 27" resolution than 4K.
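
The same arithmetic in a couple of lines, if it helps:

```python
# 5K divides exactly into the comfortable 27" working resolution; 4K does not.
fiveK, fourK, target = (5120, 2880), (3840, 2160), (2560, 1440)

print(fiveK[0] / target[0], fiveK[1] / target[1])  # 2.0 2.0 -> exact 2x, crisp 1:1 mapping
print(fourK[0] / target[0], fourK[1] / target[1])  # 1.5 1.5 -> fractional scaling, downsampled
```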
 
I also, like Jim, have a Studio Display and went Standard, and I'm extremely happy with it. I have an EIZO next to it as a secondary display (running at 1920 x 1000) and can see the difference, but it's great for a secondary.
 
I also, like Jim, have a Studio Display and went Standard, and I'm extremely happy with it. I have an EIZO next to it as a secondary display (running at 1920 x 1000) and can see the difference, but it's great for a secondary.
EIZO is also on my list, especially as they're much the same price as, or less than, the Apple. They're also technically better (full Adobe RGB, for example). I wonder what difference you see here, @Paul McFarlane? The resolution, probably, but how's the colour, given it can be fully calibrated?

An EIZO is (when in stock!) £929 at Wex. The top of the range panel version is less than an Apple Display with fancy matt screen and flexible stand - also at Wex. I can easily see Apple Displays at Apple Stores, harder to get my eyes on an EIZO anywhere...
 
AdobeRGB is not “better” than P3. They are both virtually the same size. They are just different.
For me Adobe RGB is better. P3 was developed for cinema; Adobe RGB is the industry standard for stills. Clients and designers ask me for Adobe RGB files; none have ever requested P3.
 
For me Adobe RGB is better. P3 was developed for cinema; Adobe RGB is the industry standard for stills. Clients and designers ask me for Adobe RGB files; none have ever requested P3.
Fair enough, but I think you overestimate the importance. Having a P3 monitor does not stop you from delivering AdobeRGB images to clients. The difference between the two is very slight and only visible if you have images that contain some maximally saturated colors. I used a P3 monitor for a long time and never had any complaints from clients about the AdobeRGB images I delivered. Now I have a monitor that is 100% AdobeRGB and 100% P3, but there isn’t a single client who seems to have noticed a difference.
 
Absolutely, but my point is that you do not need to have an AdobeRGB monitor to deliver AdobeRGB images to clients. Photographers were supplying AdobeRGB images to their clients long before AdobeRGB monitors existed. If delivery for print is your main business, then by all means get an AdobeRGB monitor. But do not think that a P3 monitor would be ‘wrong’ in that case.
 
Adobe RGB is not automatically better than P3, even for printing. It depends on what colors the image uses, and what the final output device is. If you have to pick one, Adobe RGB is slightly better, with the difference being small enough that it sure doesn’t bother me to use a P3 display (like the one on my MacBook Pro) to edit for print.

I like Todd Dominey’s link that RoyReed posted, and also this link, because it shows that whether P3 is “worse” or “better” than Adobe RGB can depend on the colors actually used in the images:
http://www.astramael.com/

I also once wrote an article about it:
How Do P3 Displays Affect Your Workflow?

The gist is that both Adobe RGB and P3 are so much bigger than sRGB that either is a major improvement, and both cover most print colors but not all (even Adobe RGB doesn’t). Neither is ideal; as Johan said, they are roughly the same size but cover slightly different color volumes. As shown by the links, Adobe RGB leans slightly toward the cool colors while P3 leans slightly the other way; neither can deliver a knockout blow to the other in terms of gamut size.

If you print to press CMYK, you really won’t see much difference because again, both are way larger than sRGB and cover most CMYK colors. Adobe RGB can be a slightly better match for the wider gamuts of high end photo inkjet printers, but still, I wouldn't panic if I got a file in P3. For the difference to be visible, you must be editing an image that has colors way out at the tip where Adobe RGB has slightly more coverage. Otherwise, if the image colors are not so extreme, those colors are probably going to be covered either way.

To deliver CMYK to clients, you can edit in RGB while soft-proofing through the correct CMYK profile; then, if the client still requires that you deliver CMYK images (which is becoming much less common with modern press workflows), you convert to that same CMYK profile. But Adobe RGB vs Display P3 doesn’t change that, because that is exactly what we did when we only had sRGB-gamut displays.
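
For anyone who wants to script that final convert-on-delivery step outside Lightroom/Photoshop, here's a minimal sketch using Pillow's ImageCms module; the file names and ICC profiles are just placeholders, so substitute whatever your printer or client actually specifies:

```python
from PIL import Image, ImageCms

# Placeholder names -- substitute the ICC profiles your workflow actually calls for.
RGB_PROFILE = "AdobeRGB1998.icc"
PRESS_PROFILE = "ISOcoated_v2_eci.icc"

img = Image.open("photo.tif")  # 8-bit image edited in the RGB working space

# Convert to the same CMYK profile you soft-proofed against, only at delivery time.
to_cmyk = ImageCms.buildTransform(
    RGB_PROFILE, PRESS_PROFILE,
    inMode="RGB", outMode="CMYK",
    renderingIntent=ImageCms.INTENT_RELATIVE_COLORIMETRIC,
)
ImageCms.applyTransform(img, to_cmyk).save("photo_cmyk.tif")
```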
 