Tom's Hardware is running an article today on which GPUs work best with AI. It talks about how artificial intelligence and deep learning are being used in so many programs now, and how heavily those programs rely on the GPU. Here is my point, or question.
About 4 months ago, as GPU prices started plunging, I managed to get my hands on a Founders Edition 3080 at a great price and slapped it in my rig, replacing a 1080 card that had been languishing there for 4 years. I can't point to anything documented, but I really feel like this GPU made a big difference: LR just felt snappier and smoother. So going from an old 1080 to a 3080 really seemed to matter in LR, although I can't prove it.
It was about that same time that LR released the version that used the GPU for exports (but not for creating previews).
I suspect that Adobe has quietly, without great fanfare or announcements, managed to make better use of the GPU and to take advantage of the newer, faster, more powerful GPUs in ways we might not know about. I suspect it, but I don't know for sure.
This is important because a lot of threads lately have wondered out loud whether switching to a newer, faster GPU might really help LR performance beyond just faster exports and a few other tasks, and there are many complaints of a sluggish LR from people with very old or no GPUs (and also 5- and 6-year-old CPUs).
For example, we know that LR now uses AI in its powerful new masking capabilities, but I haven't read whether the GPU is used for this new AI capability. I bet it is, because almost all programs that use AI techniques lean heavily on the GPU.
If so, then installing a really fast GPU might help with the new masking capability that is bogging down so many machines, as reported almost daily on this forum. In that case, we may benefit from installing a 3070, a 3080, or, if you really want to spend, a new 40-series GeForce GPU. At what point on the GPU ladder does LR stop benefiting? (See Tom's Hardware's constantly updated lists of the best and fastest GPUs and the GPUs with the most bang for the buck.)
Since I now have a 3080 installed, and since LR just flies at warp speed on my rig, I should do some masking and see to what degree (if any) my GPU is being utilized. I bet it is being used, since Adobe has vowed to use the GPU more and AI masking is a new feature. But I'm having trouble testing it: I can't get my diagnostic tool to show GPU spikes while masking in LR.
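For anyone who wants to try the same test, here is a minimal sketch of the approach I have in mind, assuming an NVIDIA card with the standard nvidia-smi command-line tool on the PATH (it ships with the driver on Windows). It simply polls GPU utilization once a second and flags anything above an arbitrary 50% threshold, so you can watch the readout while you drag out an AI mask in LR; the threshold and the one-second interval are my own choices, not anything Adobe documents.

```python
# Hypothetical sketch (not an Adobe tool): poll nvidia-smi once per second and
# print GPU utilization so you can watch for spikes while applying an AI mask
# in Lightroom Classic. Assumes an NVIDIA card and nvidia-smi on the PATH.
import subprocess
import time

def gpu_utilization_percent() -> int:
    """Return current utilization (percent) of GPU 0 as reported by nvidia-smi."""
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    # One line per GPU; take the first (GPU 0) and parse the bare number.
    return int(result.stdout.splitlines()[0].strip())

if __name__ == "__main__":
    print("Polling GPU utilization; start masking in LR now (Ctrl+C to stop).")
    try:
        while True:
            util = gpu_utilization_percent()
            flag = "  <-- spike" if util > 50 else ""  # 50% is an arbitrary threshold
            print(f"{time.strftime('%H:%M:%S')}  GPU {util:3d}%{flag}")
            time.sleep(1)
    except KeyboardInterrupt:
        pass
```

If the utilization jumps well above its idle level only while the mask is being computed, that would be a pretty strong hint that the AI masking is running on the GPU rather than the CPU.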
I'm sure someone somewhere has tested this (I'm not a tester).
I wish Adobe would be more forthcoming about exactly what they are doing to make better use of the GPU. I bet they are working hard on it. For example, why not use the GPU for creating previews?
I know there is legacy code getting in the way, but there is no way Adobe is ignoring the profound generational leap in GPU capability that is happening right now with the release of the new Nvidia 40-series cards and the new AMD GPUs.
Any comments or thoughts? Is there a tester out there in the PC world who has tested LR's use of the GPU in these new AI capabilities?
There are a lot of gamers out there who will have these new GPUs soon and who are also photographers using PS and LR.
There is also the idea that by getting a good GPU as these prices fall, we position ourselves for future LR releases that we know are going to use the GPU better and for more functions. You just never know when LR will come out with a new update or what they will be doing with GPU usage in it.
I'm going to build a new rig with a 4090, the fastest GPU available now and a generational leap in capability, but that will be a couple of months from now at least. I'm waiting for 4090 prices to drop to the $1,599 MSRP from over two grand now.
There have to be some gamers out there who have a 4090 and run LR.
I have looked and can't find any reports online about using a 4090 with LR, or how it handles the AI masking versus much slower cards.
But first things first. Does LR use the GPU for this new AI masking?