
GPUs for Lightroom

Status
Not open for further replies.

alaios

Active Member
Joined
Jun 27, 2014
Messages
459
Lightroom Experience
Beginner
Lightroom Version
Lightroom Version Number
latest updated weekly
Operating System
Windows 10
Hi all,
does Lightroom (and, to a similar extent, Photoshop) benefit from fast GPUs?

For example, I am considering this card:
GTX 1660 Ti

Will that be exploited nicely by Lightroom?
Regards
Alex
 
This is the information that can be found at Adobe:

https://helpx.adobe.com/lightroom/kb/lightroom-gpu-faq.html
 
This list is out of date for NVIDIA cards. For the 1000 series, or even the earlier 900 series, at what point does spending more money on a faster card or a card with more VRAM reach diminishing returns? That is, what card already provides such fast screen display that there is no point in spending more money on a faster one?

Phil Burton
 
Screen display is only a very minor part of the load in Lightroom's case. Lightroom is not a game with super-fast screen action. The real reason to want a fast GPU for Lightroom is that more and more edit calculations are sent off to the GPU rather than being carried out by the CPU (if that option is turned on in the preferences, that is).
 
I had a chat with Apple support.
 

Attachments

  • Untitled.jpeg (29 KB)
(stuff trimmed out)
The real reason to want a fast GPU for Lightroom is that more and more edit calculations are sent off to the GPU rather than being carried out by the CPU (if that option is turned on in the preferences, that is).
Johan,

Given what you just said, what kinds or amounts of edits on an image would tip the balance towards a faster GPU?

Are you also saying that generating previews in LIBRARY does not involve the GPU?

Phil Burton
 
I'm getting a new iMac in about 3 months so I'm interested in this as well.
 
Given what you just said, what kinds or amounts of edits on an image would tip the balance towards a faster GPU?
Are you also saying that generating previews in LIBRARY does not involve the GPU?
I don’t know all the details to answer that with much authority. It’s a combination of things, so you see the difference more if you work on a 4K or 5K screen: the bigger the screen, the more pixels need to be re-rendered with each edit. One example where the effect is very clear is Enhance Details. That is really slow if you don’t have a fast GPU; the CPU does not seem to do much at all when you create an Enhance Details DNG. The Library module also uses the GPU nowadays, so I would think that generating previews might be a bit speedier with a fast GPU too. The biggest bottleneck there would be disk speed, however, because rendering a lot of previews means reading a lot of raw data from disk.
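To put rough numbers on the screen-size point, here is a quick back-of-the-envelope sketch (plain Python; the resolutions are standard display sizes, not figures from Adobe):

```python
# Pixels that must be re-rendered per full-screen redraw at common display sizes.
screens = {
    "1080p": (1920, 1080),
    "4K": (3840, 2160),
    "5K": (5120, 2880),
}

base = 1920 * 1080
for name, (w, h) in screens.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.1f}x 1080p)")
```

A 5K display pushes roughly seven times as many pixels per redraw as a 1080p one, which is one reason GPU acceleration pays off most on large screens.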
 
Related to the graphics card question: I have an NVIDIA GeForce GTX 750. Do you guys see a reason to upgrade to a newer card? If yes, to which one?
I also play games, so I will buy a card for both purposes, but it would still be good if it is a card that Lightroom and Photoshop can benefit from.
Alex
 
Johan,

Given what you just said, what kinds or amounts of edits on an image would tip the balance towards a faster GPU?

Are you also saying that generating previews in LIBRARY does not involve the GPU?

Phil Burton
Generally speaking, it is just for display purposes like moving a slider, though Adobe has started to add more. The new Enhance Details feature, I think, was the first place they did non-live work on a GPU (and there, I think, they allow multiple GPUs to be used). I have not heard that previews use it. Yet. Maybe they will, maybe they never will.

Buying a GPU is partly an exercise in speculation: how soon, and in which direction, will Adobe make use of it? If you do not need one for general use, my suggestion is to delay buying a very high-end GPU and get a relatively cheap mid-range card instead. If Adobe makes the GPU much more useful, you are not out much buying another. In fact, if they follow the trends in video editing, they might let you just add one (or two or three).
 
Generally speaking, it is just for display purposes like moving a slider, though Adobe has started to add more. The new Enhance Details feature, I think, was the first place they did non-live work on a GPU (and there, I think, they allow multiple GPUs to be used). I have not heard that previews use it. Yet. Maybe they will, maybe they never will.

Buying a GPU is partly an exercise in speculation: how soon, and in which direction, will Adobe make use of it? If you do not need one for general use, my suggestion is to delay buying a very high-end GPU and get a relatively cheap mid-range card instead. If Adobe makes the GPU much more useful, you are not out much buying another. In fact, if they follow the trends in video editing, they might let you just add one (or two or three).
If/when I do buy another GPU, I will want one that supports Enhance Details. I read somewhere that the NVIDIA 1600 series does not support this feature.

Multiple GPUs just for editing in Lightroom? At some point, for a non-gamer, the money might be better spent on a faster CPU, more RAM, or a bigger/faster SSD.

Phil
 
Multiple GPUs just for editing in Lightroom? At some point, for a non-gamer, the money might be better spent on a faster CPU, more RAM, or a bigger/faster SSD.
That's absolutely true today. Whether it will be in Adobe's future, who knows. There's a lot to like about the parallelism in GPUs. At their core (pun intended), CPUs are meant for sequential calculations and struggle to do things efficiently in parallel; GPUs are the reverse. One wonders whether, if there were a real standard for GPUs so they could all run the same code, we would be a lot further down this pike. Instead, the Adobes of the world have to stay compatible with several competing mechanisms, including systems with no usable GPU at all.

Will Adobe one day cross a divide and say "we require a GPU compatible with X or the product will not run at all"? It would certainly simplify their coding substantially, and I think we would see substantially faster progress.

For now, people just stay frustrated trying to find the right combination of hardware, drivers, settings, etc.
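The sequential-versus-parallel contrast can be illustrated with a small sketch. This uses NumPy on the CPU purely as an analogy for data-parallel (GPU-style) execution; it is not anything Lightroom actually does:

```python
import numpy as np

# One image-like edit: brighten every pixel value by 10%.
pixels = np.linspace(0.0, 1.0, 100_000, dtype=np.float32)

# CPU-style: a sequential loop, one pixel at a time.
out_loop = np.empty_like(pixels)
for i in range(len(pixels)):
    out_loop[i] = pixels[i] * 1.1

# GPU-style: the same operation expressed as one data-parallel step, applied
# to every element "at once" (NumPy vectorization standing in for what a GPU
# kernel would do across thousands of cores).
out_vec = pixels * 1.1

assert np.allclose(out_loop, out_vec)
```

The edit itself is trivially parallel: no pixel's result depends on any other pixel, which is exactly the shape of work GPUs are built for.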
 
That's absolutely true today. Whether it will be in Adobe's future, who knows. There's a lot to like about the parallelism in GPUs. At their core (pun intended), CPUs are meant for sequential calculations and struggle to do things efficiently in parallel; GPUs are the reverse. One wonders whether, if there were a real standard for GPUs so they could all run the same code, we would be a lot further down this pike. Instead, the Adobes of the world have to stay compatible with several competing mechanisms, including systems with no usable GPU at all.

Will Adobe one day cross a divide and say "we require a GPU compatible with X or the product will not run at all"? It would certainly simplify their coding substantially, and I think we would see substantially faster progress.

For now, people just stay frustrated trying to find the right combination of hardware, drivers, settings, etc.
All good points you make here. However, I doubt that there will be a real standard for GPUs unless AMD/ATI goes out of business. Considering how much more powerful GPUs get with every generation, I'm going to guess that sooner or later a $200 GPU will be powerful enough for a significant performance improvement, and then Adobe might start to require one in a target system.

Today Adobe recommends 12 GB of RAM for Lightroom, but an awful lot of laptops are sold with only 4 or 8 GB. We are currently shopping for a laptop for my wife, and for her needs, which do not include LR, a machine with 8 GB of RAM and 256 GB of SSD storage is more than adequate. So if you want to use a laptop for Lightroom, you should look for machines with 16 GB (or more). They will easily cost $300, and maybe $500, more than the machine my wife will be buying.
 
They will easily cost $300, and maybe $500, more than the machine my wife will be buying.

So what percentage is $300 of all the camera equipment, lenses, bags, tripods, and similar that you own? ;)
 
So what percentage is $300 of all the camera equipment, lenses, bags, tripods, and similar that you own? ;)
If I count it all up, $300 is just a small percentage. That's rational thinking. But prices can also be very "emotional," as in, "How much am I planning to spend on this new (or upgraded) computer?" Or, "I can't spend more than $1500 total on this new computer, because my significant other will think I'm being extravagant." If you look at the Puget Systems website, their Lightroom workstations cost a lot more than $1500.
 