
Graphics card for a 4k monitor


tangled mentation
New Member
Joined: Sep 17, 2012
Messages: 19
Location: Palmerston North
Lightroom Experience: Intermediate
I know some of this has been covered before but I need some further advice.

I recently upgraded my monitor to a lovely, wonderful etc, etc Dell UP3216Q, which runs natively at 3840x2160 (i.e. 4k). It cleared out the bank but the images are great, so much so that a professional photographer mate of mine (and mentor) brings stuff over to view on it.

But my copy of LR (2015.7) can run alarmingly slowly, which is surprising given that I have a well-specced PC with 32GB of RAM (an i7-4770 running at 3.5 GHz). It has recently dawned on me that one of the major issues may be the upgrade to a 4k monitor (duh, I hear you all crying).

I have a GeForce GTX 960m, which is a pretty good card. The simple question is: would upgrading to a newer card, one of the 10 series, produce a useful improvement? If so, do I need to go for a (more expensive) 1070 or 1080, or would the (relatively) less expensive 1060 do the trick?

Given that more 4k monitors are appearing, I suspect this may be an increasingly common question. I would be grateful for some advice (my pockets are not deep enough to go out and get a new graphics card without good reason). I have been busy implementing all of the excellent advice in Victoria's recent publications on optimising LR - a great resource. Hope this makes sense. I'm writing late in the evening, as usual.
 
It would be worth asking what specifically is alarmingly slow, as the new GPU isn't a magic ticket for everything.

There are some Adobe recommendations here, which might help. Adobe Lightroom GPU Troubleshooting and FAQ

The GTX 960M looks like a mobile GPU - did I find the wrong one? What are the specs?
 
Hi Victoria,

Thanks for the reply. It is indeed worth asking what is specifically slow. When an editing session goes on for a while, editing slows down considerably. I understand from your (very clear) explanations that part of this is the iterations that Lightroom goes through when you have made a number of changes to an image in the same session. But the graphics side of things does also seem to struggle, with the screen either greying out or pixelating when I try to refresh the image (for instance when switching from the Develop module to the Library module).

I have the GTX 960 card; the GTX 960M is the mobile version. I append the specifications, and I have also included the rundown on my system from Help in Lightroom.

The Adobe recommendations are pretty conservative and fairly old, it seems to me. The new generation of 10-series Nvidia cards is much better than the previous one, drawing less power while delivering more performance.

The new GTX 1070 is 50% more powerful than the 'legendary' GTX 970, which is in turn much faster than my 960 (essentially a cut-down version of that card, I understand). The 1060, which is a cut-down version of the 1070, is pretty fast too!

I invested in a 4K screen as my old screen basically died and I wanted to future-proof myself. I suspect that photographers will increasingly be using 4K screens, so this may be a question that other photographers will be asking. I suspect that the increase in processing power of the newer cards, designed specifically for 4K screens, will improve Lightroom. But I am not well-heeled enough just to go and replace my pretty good graphics card on nothing more than a feeling. Of course, I am asking for opinions and understand that they are just that, and not tablets handed down from the Mount!

I remain very interested in your opinion and in those of others reading the forum.

Best wishes


Lightroom version: CC 2015.7 [ 1090788 ]
License: Creative Cloud
Operating system: Windows 10
Version: 10.0
Application architecture: x64
System architecture: x64
Logical processor count: 8
Processor speed: 3.4 GHz
Built-in memory: 32709.8 MB
Real memory available to Lightroom: 32709.8 MB
Real memory used by Lightroom: 579.3 MB (1.7%)
Virtual memory used by Lightroom: 531.1 MB
Memory cache size: 265.5 MB
Maximum thread count used by Camera Raw: 8
Camera Raw SIMD optimization: SSE2,AVX,AVX2
System DPI setting: 144 DPI (high DPI mode)
Desktop composition enabled: Yes
Displays: 1) 3840x2160
Input types: Multitouch: No, Integrated touch: No, Integrated pen: No, External touch: No, External pen: No, Keyboard: No

Graphics Processor Info:
GeForce GTX 960/PCIe/SSE2

Check OpenGL support: Passed
Vendor: NVIDIA Corporation
Version: 3.3.0 NVIDIA 372.90
Renderer: GeForce GTX 960/PCIe/SSE2
LanguageVersion: 3.30 NVIDIA via Cg compiler


Application folder: C:\Program Files\Adobe\Adobe Lightroom
Library Path: F:\Lightroom Catalogue\Main Catalog SWVC-2-2.lrcat
Settings Folder: C:\Users\Stephen Coppinger\AppData\Roaming\Adobe\Lightroom

Installed Plugins:
1) AdobeStock
2) DxO OpticsPro 10
3) DxO OpticsPro 10 Importer
4) Envira
5) Facebook
6) Leica Tether Plugin
7) LR/Instagram
8) Nikon Tether Plugin
9) Photo Upload
10) ViewBug

Config.lua flags: None


[Attachment: graphic card spec.jpg]
 
I'm going to hand over the floor to some of our Windows guys who are more up to date with specific hardware than I am. Puget Systems have been doing a fair bit of testing with Lightroom and different machine specs, so they could be worth contacting. Recommended System: Recommended Systems for Adobe Lightroom
 
TBH, I'm a little surprised at the thought that the 960 can't handle a 4k monitor; it's still not exactly a lightweight card (I have one myself in my Win10 system, though not with a 4k monitor). However, the screen greying out or pixelating when changing modules doesn't sound good, and certainly sounds GPU-related. Normally we'd suggest turning off the GPU Processor option in LR Preferences, but in reality you need it for that 4k monitor. One thing you could try before spending big bucks on a graphics card upgrade would be to disable hyper-threading to see if that helps at all. Running LR on an HT-enabled system can be a bit of a mixed bag, with some things a tad faster and others slower... so it might be worth a try.
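If you want a quick way to confirm whether hyper-threading is currently on before digging around in the BIOS, a minimal sketch (assuming you have Python 3 and the third-party psutil package installed) is to compare physical and logical core counts:

```python
# Minimal sketch, assuming Python 3 with the psutil package installed.
# On an i7-4770, 4 physical cores reported as 8 logical processors means
# hyper-threading is enabled; 4 and 4 means it is already off.
import psutil

physical = psutil.cpu_count(logical=False)  # real CPU cores
logical = psutil.cpu_count(logical=True)    # processors as seen by Windows/Lightroom

print(f"Physical cores: {physical}, logical processors: {logical}")
print("Hyper-threading appears to be", "ON" if logical > physical else "OFF")
```

Actually switching hyper-threading off is done in the motherboard BIOS/UEFI settings rather than from within Windows.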

Other than that, maybe try disabling some of your plug-ins (as a test), as they can sometimes interfere. If all else fails and you decide to go ahead with an upgraded graphics card, the catch is that there are no guarantees the upgrade will fix the problem. I assume that you didn't have any similar problems before the monitor upgrade?
 
It might be worth testing a different - preferably high quality - DisplayPort cable (which I assume you use to connect the monitor). I have seen so many examples of low-quality DP cables, resulting in anything from no picture at all to issues when waking up from sleep mode. A 4k monitor is demanding because it pretty much maxes out the bandwidth of the DisplayPort 1.2 connector on your card. Just a thought, and certainly cheaper than buying a new card.
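As a rough back-of-the-envelope illustration of why (these are just the standard DP 1.2 figures, not anything measured on your system):

```python
# Rough, illustrative numbers only: active pixel data for 4K at 60 Hz versus
# the usable payload of a DisplayPort 1.2 link (4 lanes of HBR2).
width, height = 3840, 2160      # UP3216Q native resolution
refresh_hz = 60                 # refresh rate driven over DP 1.2
bits_per_pixel = 24             # 8-bit RGB

pixel_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
dp12_payload_gbps = 21.6 * 0.8  # 21.6 Gbit/s raw, ~17.3 Gbit/s after 8b/10b encoding

print(f"Active pixel data : {pixel_gbps:.1f} Gbit/s")
print(f"DP 1.2 payload    : {dp12_payload_gbps:.1f} Gbit/s")
print(f"Link utilisation  : {pixel_gbps / dp12_payload_gbps:.0%} (before blanking overhead)")
```

Even before blanking intervals are added, roughly 12 of the 17 Gbit/s available is in use, so a marginal cable that coped fine at 1080p or 1440p has very little headroom left at 4K.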
 
Thanks very much for the comments. A little lightbulb went off in my head with Robert's comment. I am already on my second DP cable because the first one produced lots of problems: it was a long cable, due to the configuration of my office, and could not handle the new monitor. Clearly something for me to consider further.

Jim, your comments are interesting. I will have to Google how to disable hyper-threading. Of course, the problem is that I use most of my plug-ins, having already disabled the ones I don't use, so I am in a Catch-22 situation in that disabling them might solve the problem but at the expense of utility.

It seems clear that there is no guarantee that upgrading my card will produce a dramatic difference. I have perhaps overstated my problems too. My system is not too shabby and runs pretty well. It's only when I go into demon mode in the Develop module that things start to unravel a bit.

I went on to the Puget Systems site as recommended by Victoria. There is no mention of the 10-series cards, so I think the jury remains out.

Thanks again for the comments – very helpful
 
Hi Robert,

I suspect you are right. My existing card is a good one and so I'm not rushing off to get a better one. Maybe when they start to fall in price (or someone does the experiment and reports back)!
 