
LR Use of AI and thus the GPU

Not open for further replies.


Greg Johnson
Jul 11, 2011
San Antonio, TX
Lightroom Experience: Power User
Lightroom Version: Cloud Service
Lightroom Version: Latest Version of Classic via Adobe Cloud
Operating System: Windows 11
Tom's Hardware is running an article today on which GPUs work best with AI. It talks about how artificial intelligence and deep learning are now used in so many programs, with a heavy reliance on the GPU to do the work. Here is my point, or question.
About 4 months ago, as GPU prices started plunging, I managed to get my hands on a Founders Edition 3080 at a great price and slapped it in my rig, replacing a 1080 that had been languishing there for 4 years. For reasons I can't document, I really feel like this GPU made a big difference: LR just snapped better and felt smoother. So going from an old 1080 to a 3080 really seemed to matter in LR, although I can't prove it.
It was about that same time that LR was releasing the version that used the GPU on Exports (but not on creating previews).
I suspect that Adobe has quietly without great fanfare or announcements, managed to make better use of the GPU and also make use of the newer faster more powerful GPUs in ways we might not know. I suspect it, but don't know for sure.
This matters because a lot of recent threads have wondered aloud whether switching to a newer, faster GPU might help LR performance beyond just faster exports and a few other tasks, and there are many complaints of a sluggish LR from people with very old GPUs or none at all (and also 5- and 6-year-old CPUs).
For example, we know that LR now uses AI in its powerful new masking capabilities, but I haven't read if the GPU is utilized for this new AI capability. I bet it is because almost all programs that use AI techniques use the GPU a lot.
If so, then installing a really fast GPU might help with the new masking capability that is bogging down so many machines as reported almost daily on this forum.
If so, then we may benefit from installing a 3070, a 3080, or if you really want to spend, a new 40-series GeForce GPU. At what point on the GPU ladder does LR stop benefiting? (See Tom's constantly updated lists of the best and fastest GPUs and the GPUs with the most bang for the buck.)
Since I now have a 3080 installed, and since LR just flies at warp speed on my rig, I should do some masking and see to what degree my GPU is being utilized (if any).
I bet it is since Adobe has vowed to use the GPU more and AI is a new feature.
But I'm having trouble testing it: I can't get my diagnostic tool to show GPU spikes while masking in LR.
I'm sure someone somewhere has tested this (I'm not a tester).
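For anyone who wants to try the test themselves, here's a rough sketch of how I'd log GPU utilization while painting an AI mask. It assumes an Nvidia card with the `nvidia-smi` tool on the PATH (it ships with the Nvidia driver); the polling loop and the 30-second window are just illustrative choices, not anything Adobe documents.

```python
import subprocess
import time

def parse_utilization(csv_line: str) -> int:
    """Parse one line of `nvidia-smi --query-gpu=utilization.gpu
    --format=csv,noheader,nounits` output, e.g. '37' -> 37."""
    return int(csv_line.strip())

def poll_gpu(seconds: int = 30, interval: float = 1.0) -> list[int]:
    """Sample GPU utilization once per `interval` for `seconds` seconds.
    Run this while dragging an AI mask in LR and watch for spikes."""
    samples = []
    end = time.time() + seconds
    while time.time() < end:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        samples.append(parse_utilization(out))
        time.sleep(interval)
    return samples

# Usage (on a machine with an Nvidia GPU): start this, then paint an
# AI mask in LR for ~30 seconds and look at the peak:
#   readings = poll_gpu(30)
#   print(f"peak GPU utilization: {max(readings)}%")
```

If the peak jumps well above idle levels only while you're masking, that's a decent sign the masking work is landing on the GPU rather than the CPU.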
I wish Adobe would be more forthcoming about exactly what they are doing with the better usage of the GPU. I bet they are working hard on it. For example, why not use the GPU for creating previews?
I know there is legacy code getting in the way, but there is no way Adobe is ignoring the profound generational leap in GPU capability happening right now with the release of the new Nvidia 40-series cards and the new AMD GPUs.
Any comments or thoughts? Is there a tester out there in the PC world who has tested LR's use of the GPU in these new AI capabilities?
There are a lot of gamers out there who will have these new GPUs soon and who are photographers who use PS and LR.
There is also the idea that by getting a good GPU as these prices fall, we position ourselves for future LR releases that we know are going to use GPUs better and for more functions. You never know, when LR comes out with a new update, what they are doing with GPU usage.
I'm going to build a new rig with a 4090, the fastest GPU available now and a generational leap in capability, but it will be a couple of months from now at least. I'm waiting for 4090 prices to drop to $1,599 (MSRP) from over 2 grand now.
There have to be some gamers out there who have a 4090 and run LR.
I have looked and can't find any reports online of 4090 usage with LR, or how it handles the AI masking versus much slower cards.
But first things first. Does LR use the GPU for this new AI masking?
Just wanted to add this snip from Adobe about GPU usage. The answer is yes: the GPU is heavily used by the new AI masking. The 3080 I installed also helps because I have a 32-inch 4K IPS pro monitor, which gets help from the GPU in many ways.

From Adobe: [quoted snippet not preserved]
So, it is pretty obvious that I'm benefiting from the 3080 GPU in many ways. A lot of you having problems with the masking stuff should pay attention to this.
I just found a tester who is a gamer and LR user. He wanted to know if his GeForce 3080 was remarkably faster in LR than his CPU's integrated graphics alone. He tested both and found that the 3080 added about 15% across the board in the Develop module, with much faster exports, compared to turning off the GPU and using only the integrated graphics of a 12th-gen Intel i9 CPU.

But he also postulated that going from a 3060 to a 3080 was not that much of a gain in LR. He thought LR could do a better job of using the highest-end GPUs to get better acceleration across the board. But he made the point that using a 3060 versus integrated graphics or nothing at all is a dramatic improvement, and remember, a 3060 is a fast GPU compared to what most people on this board probably have.

The point is, moving to the very fastest gaming cards (3080, 3090 and the new 40 series) may not gain you much (10%?) over something like a 3060 (which is still a very fast card).
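To put numbers on that diminishing-returns argument, here is a back-of-the-envelope sketch. The speedup figures are this thread's rough guesses (a 3060 capturing most of the ~15% gain over integrated graphics, a 3080 maybe 10% more on top), and the prices are assumed street prices, not quotes, so treat the whole thing as illustrative arithmetic rather than a benchmark.

```python
# Illustrative only: speedups are rough guesses from this thread,
# prices are assumed street prices.
gpus = {
    "RTX 3060": {"price": 350, "speedup_pct": 15},
    "RTX 3080": {"price": 800, "speedup_pct": 25},  # ~10% more than a 3060
}

def gain_per_100_dollars(price: float, speedup_pct: float) -> float:
    """Percentage points of assumed LR speedup per $100 spent."""
    return speedup_pct / price * 100

for name, spec in gpus.items():
    ratio = gain_per_100_dollars(spec["price"], spec["speedup_pct"])
    print(f"{name}: ~{spec['speedup_pct']}% faster, "
          f"{ratio:.1f} pts per $100")
```

Even with these made-up numbers, the 3060 delivers noticeably more speedup per dollar than the 3080, which is the whole bang-for-the-buck point.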

So for that fellow last week who was asking if he would get much out of LR by switching to a 3060 from an old 4-year-old card: the answer is a resounding yes. Now, moving from a 3060 to my 3080? Maybe not that much. But you never know what Adobe is up to in terms of GPU usage and acceleration, so one must think that Adobe is not ignoring the generational leap in GPU power occurring right now with the Nvidia 40 series and the new AMD GPUs coming out.
We all have differing needs. For what it's worth, this is my situation.
About a year ago, I built a system based on an i7 12th gen with integrated graphics and 32GB of DDR5 RAM. I have the option of adding a discrete GPU but wanted to avoid it if possible. My gaming days are long over. At first there were issues. It was looking like a discrete GPU was inevitable. With Windows 11 as well, things were very new. Driver, BIOS and system updates were coming through regularly. Masking still sucked. Then with LrC V12.1 everything fell into place. It has been running like a dream ever since. The integrated GPU runs at less than 10% on everything except Super Resolution where it maxes out. However, I can still do a 20MB image in about 50 seconds. That's plenty fast enough for me since I rarely use Super Resolution.
Yes, it seems likely that there will be greater use of AI and GPU power. However, if I needed to upgrade, I'm tempted to first go to an i9 13th+ gen or the max the motherboard will allow, so as to get the added benefits of a big CPU. After that, I'd look at something like a 3060 GPU which should be quite cheap by then.
That's a good option, Bob. A 3060 will be cheap by then for sure, as will a 3070 or even a 3070 Ti. I don't think 13th-gen Raptor Lake is a big gain over your 12th-gen CPU as far as big generational leaps go. The 12th-gen CPU rocks.
The point is, LR is utilizing your integrated graphics, and the integrated graphics on your 12th-gen chip is really good. But the tester I saw was saying that a 3060 GPU would give you a nice leap in LR GPU acceleration compared to integrated graphics. Apparently, LR really benefits up to around a 3060-level GPU; beyond that, it's maybe a 10% gain up to a still-expensive ($800) 3080.
Man ... when you build again someday, add a GPU even if you don't game.
I don't think 13th gen Raptor Lake is a big gain from your 12th gen CPU as far as big generational leaps go.
You are right. I've seen a report that 13th gen gives only a 25% improvement over 12th gen. However, it's already out. I hope I won't need an upgrade for some time yet. 15th gen sounds good ;)
Adobe was once proud that LrC didn't need a GPU, but lately it has been talking up GPUs. The writing is on the wall, I guess. Adding a discrete GPU is no problem; it's just plug and play, but I don't want one if I don't need it yet. I'd rather put the money towards another lens :p