27-inch monitor resolution: 2K or 4K?

fsphoto

Member
Premium Classic Member
Joined
Aug 3, 2017
Messages
29
Lightroom Experience
Intermediate
Lightroom Version
Classic
Operating System
Windows 10
What are the pros and cons of a 2k vs a 4k resolution for a 27 inch monitor?
I am aware that resolution is a secondary factor, when compared to color gamut, profiling the monitor etc.
I am asking only about resolution.
 
And that is wrong, and proves that Adobe does not use the native resolution of the monitor to display the image, but the scaled resolution. The monitor setup should be irrelevant (and so it should not affect the preview size), because that setup obviously does not change the native resolution of the monitor.
I think we have drifted far enough off topic that the OP's eyes have glazed over, assuming they are still awake!
 
I think we have drifted far enough off topic that the OP's eyes have glazed over, assuming they are still awake!
I am and they have.
I do appreciate everyone's thoughts and advice.

So, let me ask another way.
Your OS is Windows, and you are looking for a 27-inch monitor.
2K or 4K?
 
Your OS is Windows, and you are looking for a 27-inch monitor.
2K or 4K?
I think Matt O'Brien has already given you an answer.

I have a question to throw back at you. Why limit yourself to 27"? 32" monitors are becoming the norm today. Personally I think 2K screens are dated. Why do you think they are so cheap? I think in the very near future the de facto monitor will be a 32" HDR display with 1000 or more nits of brightness.
 
I was forced to replace my monitor when the previous one just stopped working. I put a lot of effort into researching the best options for me. Critical for me was a high percentage of conformance to AdobeRGB. Next… I had decided on 4K… as that appeared at the time to be the emerging standard. I do not shoot video, but 4K video recording is close to the norm now. My ideal setup would have been 2 x 24-inch monitors, but I could not find ones which matched my required spec. So, I opted for a 32-inch version… As with most things… I was moving from a 27-inch screen… and now would find it difficult to go back to that size.

A factor for everyone is whether you use glasses and the nature of the correction required. I can use mine without glasses if I am sitting close to the screen, and with glasses if just a little bit further back. I prefer to sit in the middle distance… so sometimes I find I am putting on or taking off my glasses.

While all monitors have a shelf life… I did not want to be in remorse mode 12-18 months later, disappointed that I had opted for the 2K version.

I was limited (here in Dublin, Ireland) in the range of options and could not go to a store to see these devices… but ultimately had to order from a catalog, through a local supplier.
 
I think Matt O'Brien has already given you an answer.

I have a question to throw back at you. Why limit yourself to 27"? 32" monitors are becoming the norm today. Personally I think 2K screens are dated. Why do you think they are so cheap? I think in the very near future the de facto monitor will be a 32" HDR display with 1000 or more nits of brightness.
From the research I have done, Eizo is the best choice for me: true 10-bit color, best warranty, best reviews, etc.
They don't offer a 32-inch in my budget.
The only question remaining is 2K or 4K?
There is lots of information and opinions for both resolutions being the better choice. Some recommend 2K, others 4K.
I was hoping to find a definitive reason to choose one resolution over the other, but it seems to come down to personal preference.
 
At some point 2K monitors will no longer be sold, as they are being phased out in favor of 4K. I found this definition:
3840x2160 is "4k" (3840 ≈ 4000)
2560x1440 is "2.5k" (2560 ≈ 2500)
1920x1080 is "2k" (1920 ≈ 2000)

You need to consider more than pixels. Search for a monitor with a wide color gamut (99% DCI-P3 and 99% Adobe RGB coverage) and an advertised brightness of at least 400 nits.

How does EIZO compare with those specs?
 
From the research I have done, Eizo is the best choice for me: true 10-bit color, best warranty, best reviews, etc.
They don't offer a 32-inch in my budget.
The only question remaining is 2K or 4K?
There is lots of information and opinions for both resolutions being the better choice. Some recommend 2K, others 4K.
I was hoping to find a definitive reason to choose one resolution over the other, but it seems to come down to personal preference.
I bought two Eizo 27" monitors a few years ago, before there were any 4K versions available, and I am more than happy with the resolution and colour performance. The ColorNavigator software does a great job of looking after the calibration. I have a number of different profiles depending on what sort of work I am doing, and different settings for paper types depending on the base whiteness. Remember, if you are going to print, the brightness settings need to be really low, depending on your working environment!
 
At some point 2K monitors will no longer be sold, as they are being phased out in favor of 4K. I found this definition:
3840x2160 is "4k" (3840 ≈ 4000)
2560x1440 is "2.5k" (2560 ≈ 2500)
1920x1080 is "2k" (1920 ≈ 2000)
You need to consider more than pixels. Search for a monitor with a wide color gamut (99% DCI-P3 and 99% Adobe RGB coverage) and an advertised brightness of at least 400 nits.

How does EIZO compare with those specs?
I have considered many more criteria and features than pixels.
Eizo meets or exceeds all my requirements as far as color gamut, etc. goes.
I am just asking about 2K vs 4K resolution in a 27-inch monitor.
What are people's experiences and preferences? Which do they prefer and why?
 
I am just asking about 2K vs 4K resolution in a 27-inch monitor.
What are people's experiences and preferences? Which do they prefer and why?

I edit on an older 27" display that’s neither 2K nor 4K, it’s something in between: 2560 x 1440 pixels, which is still rather common. I’m still happy with it because my #1 priority is color accuracy, and it still calibrates well. I fully expect my next 27" display to be 4K or higher just because that’s more of the standard now, but 2560 x 1440 px has not held me back in any way for photo editing itself.

Because you mentioned sharpening for print, note that even if your display matches your printer dpi (e.g., 300 dpi, which many smartphone screens actually exceed now), a resolution match doesn’t guarantee perfect sharpening preview because digital displays don’t render image details the same way as a printer. Some of the traditional advice passed down (e.g., “always judge sharpening at 50% zoom”) was based on halftone screening (for a printing press) viewed on 1990s displays (72 to 110 dpi). That might not apply to how sharpening looks through FM screening (desktop inkjet printer) when previewed on a Retina (Apple) or HiDPI (Windows/Android) display. The best way to judge sharpening on screen is not to generalize from what someone else said based on a software/display/printer combination you don’t know, but instead do some test prints to understand how various amounts of sharpening look on your display for the printer you use.

Addressing your question above, the main thing that’s different between the choices you stated is the pixel density (1x, 2x…). That can be important for someone who wants to see sharper UI and text, or needs to properly preview graphics they design for 2x (Retina/HiDPI) displays. But for pure photography, pixel density is a lot lower priority than image quality (tone/color accuracy), so pixel density could be sacrificed on a limited budget.

If pixel density is a priority for you, then these are the differences you’ll see between 2K and 4K (and more):

Inches (diagonal) | Panel dimensions in px | DPI (pixel density) | OS default scale factor | Effective UI dimensions | Megapixels
27 | 1920 x 1080 (2K) | 81 | 1x | 1920 x 1080 @1x | 2.1
27 | 2560 x 1440 (QHD) | 109 | 1x | 2560 x 1440 @1x | 3.7
27 | 3840 x 2160 (4K) | 163 | 2x (Retina/HiDPI) | 1920 x 1080 @2x | 8.3
27 | 5120 x 2880 (5K) | 218 | 2x (Retina/HiDPI) | 2560 x 1440 @2x | 14.75
32 | 3840 x 2160 (4K) | 138 | 2x (Retina/HiDPI) | 1920 x 1080 @2x | 8.3
32 | 6016 x 3384 (6K) | 216 | 2x (Retina/HiDPI) | 3008 x 1692 @2x | 20.4

I threw in the Megapixels column because you might compare that to the megapixels in your cameras. The more they differ, the less you’ll be able to work at 1:1 and the more you’ll have to scroll or zoom. For example, if I had a 5K display it could show every pixel of my older 12MP camera at 1:1 with room left over for tools.
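
As a rough sketch of that comparison (assuming 1:1 simply means one image pixel per panel pixel, and ignoring the space taken up by panels and toolbars; the 4256 x 2832 frame size is just an example of a typical 12 MP sensor):

```python
# Sketch: how much of a camera frame is visible at 1:1 (100% zoom), assuming
# one image pixel maps onto exactly one panel pixel.
def fraction_visible_at_1_to_1(image_w, image_h, panel_w, panel_h):
    """Fraction of the image area visible without scrolling at 100% zoom."""
    visible_w = min(image_w, panel_w)
    visible_h = min(image_h, panel_h)
    return (visible_w * visible_h) / (image_w * image_h)

# Example 12 MP frame (4256 x 2832) on a 5K panel (5120 x 2880): the whole frame fits.
print(fraction_visible_at_1_to_1(4256, 2832, 5120, 2880))  # 1.0
# The same frame on a 27" 4K panel (3840 x 2160): about 69% of the pixels fit at once.
print(fraction_visible_at_1_to_1(4256, 2832, 3840, 2160))  # ~0.69
```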

The first row shows that a 2K display at 27" is a rather low resolution display, because 27" is a long way to stretch out just 1920 pixels across. It can look a lot coarser than all other displays in the table, and although the effective working area is OK, it’s kind of minimal by today’s standards…not a lot of room to spread out windows and panels while leaving enough room over for the image you’re working on.

The last row represents the other extreme, the Apple Pro Display XDR…when you look at the numbers you can see why Apple thought a 32" display should be 6K. We don’t have to agree with that, especially since most of us can’t afford that display, but what 6K buys you at 32" is both a very large effective working area and 2x pixel density at the same time.

You can also see why Apple likes 5K for 27" displays: That results in perfect 2x pixel density of the traditional 2560 x 1440 px standard for 27".

The options you stated are the first and third rows in that table. Hopefully it will help you decide based on what kind of effective screen working area and pixel density you want. You can calculate your own numbers from many free monitor resolution calculator websites; that’s how I got those numbers.
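
If you'd rather not use a website, the arithmetic behind those calculators is simple; here is a minimal sketch (the panel sizes are the ones from the table above, used only as examples):

```python
import math

# Minimal sketch of the arithmetic behind the table above: pixel density (dpi),
# effective UI size at the OS default scale factor, and panel megapixels.
def monitor_numbers(diagonal_in, px_w, px_h, scale=1):
    dpi = math.hypot(px_w, px_h) / diagonal_in   # pixels per inch along the diagonal
    effective = (px_w / scale, px_h / scale)     # the UI "looks like" this size at the given scale
    megapixels = px_w * px_h / 1e6
    return round(dpi), effective, round(megapixels, 1)

print(monitor_numbers(27, 2560, 1440, scale=1))  # (109, (2560.0, 1440.0), 3.7)
print(monitor_numbers(27, 3840, 2160, scale=2))  # (163, (1920.0, 1080.0), 8.3)
print(monitor_numbers(32, 6016, 3384, scale=2))  # (216, (3008.0, 1692.0), 20.4)
```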

For context, a typical 1x (legacy) desktop or laptop display is around 96–130 dpi, and a typical 2x (Retina/HiDPI) display is around 200–250 dpi.
 
I edit on an older 27" display that’s neither 2K nor 4K, it’s something in between: 2560 x 1440 pixels, which is still rather common. I’m still happy with it because my #1 priority is color accuracy, and it still calibrates well. I fully expect my next 27" display to be 4K or higher just because that’s more of the standard now, but 2560 x 1440 px has not held me back in any way for photo editing itself.

Thank you for your time and effort in responding to my post.

The two monitors I am trying to decide between are the Eizo CS2731, which is 2560 x 1440 resolution (109 dpi) and which I have mistakenly been referring to as 2K when it is actually 2.5K, and the Eizo CS2740, which is 3840 x 2160 resolution (164 dpi).

I do use high-resolution 60 MP (9504 x 6336) cameras, so on a 4K monitor would I be able to work at 1:1 and not have to scroll or zoom as much as I would on the 2.5K monitor?

Your advice about sharpening for print is particularly informative and helpful. I never thought of sharpening in those terms, i.e. the amount of sharpening to apply to an image is best judged based on the particular combination of display and printer I am using. Upgrades to the monitor and/or the printer in the future would best be followed by additional testing to find the appropriate amount of sharpening for that monitor/printer combo.

Thank you once again for your thorough and thoughtful response.
 
The two monitors I am trying to decide between are the Eizo CS2731, which is 2560 x 1440 resolution (109 dpi) and which I have mistakenly been referring to as 2K when it is actually 2.5K, and the Eizo CS2740, which is 3840 x 2160 resolution (164 dpi).

I took a quick look at their specs, and yes, panel resolution seems to be the only big difference and apparently the only reason the prices are different. Because both are Eizo ColorEdge models, it should be safe to assume the color accuracy will be as good as it gets, especially if you regularly update the hardware calibration.

Because they’re so similar on paper, I’m going to say it all comes down to budget. If you’re OK with the higher price of the CS2740, then you get a great pro-level color-accurate 27" display that supports hardware calibration and has a 4K panel. For those reasons, I already think of the CS2740 as a potential replacement when my current display dies. A reason to get the CS2731 instead is if you need to save a few hundred dollars and having Retina/HiDPI resolution is not important to you; all the other specs and color quality should be the same. The CS2731 will actually have more effective screen real estate (2560 x 1440 @1x) than the CS2740 (1920 x 1080 @2x), although the 2x pixel density will make UI and text look sharper on the CS2740.

Both are limited to 350 nits of luminance, which is more than enough for editing photos for print and for posting on websites/social media. But if you plan to use the latest HDR editing tools in Adobe photo apps (not the same as the older HDR merge feature), the maximum brightness of these displays is too low to meet Adobe HDR requirements (such as 1000 nits sustained). If that’s not important to you, then continue choosing between the CS2731 and CS2740, and either should perform well. If you know you’ll want to do Adobe HDR editing soon, then look at other newer models, like maybe the ASUS ProArt displays that Cletus is very happy with.
 
The CS2731 will actually have more effective screen real estate (2560 x 1440 @1x) than the CS2740 (1920 x 1080 @2x), although the 2x pixel density will make UI and text look sharper on the CS2740.
Nobody puts a gun against your head and forces you to use 1920 x 1080 @2x resolution on a 4K monitor. You can use 2560 x 1440 on that monitor too. I already explained how macOS deals with that. I could be wrong, but I believe Windows deals with it in a similar way: if you set the monitor to its native resolution and use 150% scaling, that effectively makes it 2560 x 1440, because 3840/1.5 = 2560.
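
A quick sketch of that arithmetic for the usual Windows scale factors (illustrative values only, assuming a 3840 x 2160 panel):

```python
# Effective "looks like" workspace of a 3840 x 2160 (4K) panel at common Windows
# scale factors. The panel keeps rendering at its native resolution; only the
# size of the UI layout changes.
native_w, native_h = 3840, 2160
for scale in (1.00, 1.25, 1.50, 2.00):  # 100%, 125%, 150%, 200%
    print(f"{int(scale * 100)}%: {native_w / scale:.0f} x {native_h / scale:.0f}")
# 100%: 3840 x 2160
# 125%: 3072 x 1728
# 150%: 2560 x 1440   <- same workspace as a native 2560 x 1440 monitor
# 200%: 1920 x 1080
```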
 
It is my understanding that image editing, sharpening for print, retouching, etc. is best done at 100%, where 1 image pixel = 1 screen pixel.
If that is the case, wouldn't the 2.5K resolution monitor be better/easier to edit with, since at 100% zoom the actual screen pixels are larger and easier to see? Whereas on the 4K resolution monitor the pixel pitch is smaller, so when zoomed in to 100% the image is smaller and harder to see. In order to have the image be the same size on both screens I would have to zoom in more on the 4K monitor; wouldn't doing so result in interpolation, i.e. 1 image pixel being displayed by more than 1 screen pixel? How much of an effect does that have on editing?
Also, how much of a factor is the user's eyesight or visual acuity? It would seem to me that larger pixels might make for a better experience?
While the UI and text would appear sharper on a 4K screen, from what I have learned they would also be smaller, so does scaling up the UI as Johan suggested diminish the benefits of a 4K screen? Can you scale up the UI without affecting the image resolution in Lightroom Classic and Photoshop?
As for the new HDR capabilities in Lightroom, I don't see that as a factor for me, since I do not do any video and current inkjet printers and papers can't produce the contrast/tonal range for HDR.
I greatly appreciate all the help and advice from everyone, thank you all!
 
That’s debatable. A print at 300 ppi has pixels that you can’t see individually. That means that judging this on screen at 100% may be more misleading than useful. Maybe viewing at 50% will give you a better idea of what the print will look like. A 4K screen would actually look closer to the printed image when viewing at 100% than a 2K screen…
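
A back-of-the-envelope sketch of why (using the 109 dpi and 163 dpi figures quoted earlier in the thread and assuming a 300 ppi print):

```python
# How much larger an image pixel appears on screen than it does in a 300 ppi
# print. At 100% zoom one image pixel covers exactly one screen pixel, so the
# linear magnification relative to the print is print_ppi / screen_dpi * zoom.
PRINT_PPI = 300

def on_screen_magnification(screen_dpi, zoom=1.0):
    """Apparent size of an image pixel on screen vs. in print (1.0 = same size)."""
    return PRINT_PPI / screen_dpi * zoom

for name, dpi in [('27" 2.5K (109 dpi)', 109), ('27" 4K (163 dpi)', 163)]:
    print(name, "at 100%:", round(on_screen_magnification(dpi), 2),
          "| at 50%:", round(on_screen_magnification(dpi, zoom=0.5), 2))
# 27" 2.5K (109 dpi) at 100%: 2.75 | at 50%: 1.38
# 27" 4K (163 dpi) at 100%: 1.84 | at 50%: 0.92
```

So at 100% the 4K screen (1.84x) is closer to print size (1.0x) than the 2.5K screen (2.75x), and on the 2.5K screen 50% zoom gets you closer still.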
 