
LR Use of GPU - Big Improvement in V11 of Classic

Joined
Jul 11, 2011
Messages
243
Location
San Antonio, TX
Lightroom Experience
Power User
Lightroom Version
Cloud Service
Lightroom Version
Classic latest Version 11
Operating System
Windows 11
The LR 11 release notes say "If your GPU RAM is above 8 GB, GPU acceleration for export is enabled by default. However, if the GPU RAM is less than 8 GB, then select Edit > Preferences > Performance > Use Graphics Processor > Custom > Use GPU for Export." So that means LR is using the GPU for exports now. That is great news.

I think this is important because, for the first time in over two years, we can buy GPUs at close to MSRP thanks to the crypto crash and miners no longer buying up all the GPUs. I just bought an RTX 3080 Ti at MSRP and slapped it in my PC, which had languished with an old GPU because of the massive GPU shortage and super high prices. PC builders are buying GPUs now at MSRP and even below, and this LR update makes much better use of the GPU. I ran a couple of simple "tests" on exports and on generating previews to see if and how the GPU kicks in.

I exported 100 full-size JPEGs from 200MB Fuji GFX raw files in LR and watched the behavior of the CPU and GPU in some monitoring tools. The export job went much faster than ever before. Why? Because the GPU was engaged. The CPU does not max out like it does when generating previews; it spikes to around 50-60% load on each JPEG export. The GPU engages to a mid-level degree and temps climb a bit. The point is, the GPU kicks in and works on the export job alongside the CPU, so the job goes much faster and the CPU works less hard. This is great news, because LR has always had a reputation for not utilizing GPUs enough.
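For anyone who wants to watch the GPU engage during an export the same way, NVIDIA cards can be sampled from the command line with `nvidia-smi --query-gpu=utilization.gpu,memory.used --format=csv,noheader,nounits -l 1`. A minimal sketch of summarizing those samples (the numbers below are made up for illustration, not my measured values):

```python
import csv
import io

# Example of what the nvidia-smi query above prints once per second:
# "<gpu utilization %>, <memory used MiB>". These rows are illustrative only.
SAMPLE = """\
42, 5120
55, 5600
48, 5400
"""

def average_gpu_util(csv_text):
    """Average the GPU-utilization column from nvidia-smi CSV samples."""
    rows = [row for row in csv.reader(io.StringIO(csv_text)) if row]
    utils = [int(row[0]) for row in rows]
    return sum(utils) / len(utils)

print(f"average GPU utilization: {average_gpu_util(SAMPLE):.1f}%")
```

Capturing the real output to a file while an export runs (`nvidia-smi ... -l 1 > gpu.log`) and averaging it afterwards makes it easy to compare runs with GPU export on versus off.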

I generated 800 1:1 previews as a test job. As always in LR, building previews is a CPU-heavy job: the CPU stays pinned at 100% load for the entire job, and temps climb pretty high. The GPU does not engage at all.

Does anyone know if Adobe is planning to utilize the GPU for building previews as they have now for the export function?

LR just keeps getting better and faster, and if you have a newer PC or a new or newer mid-level (or above) laptop, LR flies at warp speed. That is very refreshing, because for ten years I have listened to people complain about LR speed on the photography blogs and forums. Not here - I'm not here very much - I mean on the photography forums where the Adobe LR vs. other raw editors wars are waged, and I'm always defending LR because it is the best.

 
Joined
Sep 29, 2007
Messages
23,664
Location
Isle of Wight, UK
Lightroom Experience
Power User
Lightroom Version
Cloud Service
Does anyone know if Adobe is planning to utilize the GPU for building previews as they have now for the export function?

Adobe doesn't pre-announce their plans, but their focus is on improving performance, so preview building seems a likely candidate to move to the GPU.