
Develop module Denoise Time?

russellsnr

Active Member
Joined
Dec 13, 2011
Messages
148
Location
UK
Lightroom Experience
Intermediate
Lightroom Version
6.x
Lightroom Version Number
14
Operating System
macOS 15 Sequoia
Hi, on a Mac mini M2, 16 GB.
On, say, an ISO 4000 image, approximately how long should Denoise take? The software estimates about 2 minutes, but timing it with my phone's stopwatch I get roughly 1 min 10 sec from start to finish. Is that normal for an ISO 4000 image, please?
Thank You, Russ.
 

I think Denoise time estimates are based on the number of pixels, not the amount of noise. I believe the process works by finding edges and smoothing adjacent pixels.

The estimates are often off (the actual time is usually longer than the estimate).

Another factor is your processor. I have an M2 Ultra with 64GB of unified memory, and my 48 MP Nikon images are estimated at 8 seconds, though lately they take a little longer than that to produce a completed DNG.


Sent from my iPad using Tapatalk
 
For Windows users, the duration is highly dependent on the AI-specific capabilities of the GPU. To get a more reliable, representative duration, I like to run Denoise on 10 different images at a time and divide by 10. Why 10? It's easier ;)
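The batch-and-average trick above can be sketched in a few lines. This is a hedged illustration, not a Lightroom feature: `average_duration` and its arguments are hypothetical names, and the point is simply that timing a batch and dividing spreads per-run overhead and timer noise across the items.

```python
import time

# Sketch of the batch-and-average timing trick described above
# (hypothetical helper, not part of Lightroom): run a task over a
# batch of items and report the mean wall-clock seconds per item.
def average_duration(task, items):
    """Run `task` on every item and return mean seconds per item."""
    start = time.perf_counter()
    for item in items:
        task(item)
    return (time.perf_counter() - start) / len(items)
```

For example, timing 10 denoise jobs and dividing by 10, exactly as suggested above, smooths out the variation you would see timing a single image.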
 
I tried that on 5 images. Individually, the estimate per image was 8 sec. Collectively, the estimate was 40 sec. Actual time: 60 sec.
 
There are two main factors:
  1. The number of pixels in the image, such as 24 megapixels vs 48 megapixels. More pixels takes longer.
  2. The power of the GPU.
You’re using an Apple Silicon Mac, where, within the same processor generation (such as M2), AI Denoise time generally scales linearly with the number of GPU cores. So for a given number of pixels, the base M2 Mac mini with the 10-core GPU should take almost twice as long to denoise as the M2 mini upgraded to the 19-core GPU.

On YouTube, ArtIsRight regularly re-tests various scenarios with new Macs, and this week he posted a video that included the M4 Macs. The frame below is from that video at 15:03. (I added the yellow arrows and text.) The M2 Mac mini with 10 GPU cores denoised his 36 megapixel test file in 62 seconds, while the M2 Mac Studio with 60 GPU cores did it in a predictable 13 seconds, because the M2 Ultra level processor has 6 times as many GPU cores as the M2 base level. You can look down the test chart and see how Denoise time tracks with the number of GPU cores.

If your time is 1 minute 10 seconds, and you have the 10-core GPU, it sounds like maybe that’s a higher megapixel file than his, like maybe 45 megapixels?

[Image: ArtIsRight's Lightroom Classic 14 Denoise benchmark chart for M4 Macs]


(This explanation is for Macs only. Denoise time works differently on PC GPUs.)
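A rough way to see the arithmetic in the post above: treat Denoise time as linear in megapixels and inverse in GPU core count, anchored to the 62 s / 36 MP / 10-core measurement quoted from ArtIsRight's test. This is a back-of-the-envelope sketch under those two scaling assumptions, not Adobe's actual cost model.

```python
# Back-of-the-envelope model (an assumption, not Adobe's cost model):
# Denoise time ~ megapixels / GPU cores, anchored to one measured point
# (62 s for a 36 MP file on a 10-core M2 GPU, from the thread above).
def estimate_denoise_seconds(megapixels, gpu_cores,
                             ref_seconds=62.0, ref_mp=36.0, ref_cores=10):
    """Scale the reference time linearly in pixels, inversely in cores."""
    return ref_seconds * (megapixels / ref_mp) * (ref_cores / gpu_cores)

# 36 MP on a 60-core M2 Ultra: ~10 s (ArtIsRight measured 13 s)
print(round(estimate_denoise_seconds(36, 60), 1))   # 10.3
# 45 MP on the 10-core M2 mini: ~78 s (close to Russ's measured ~70 s)
print(round(estimate_denoise_seconds(45, 10), 1))   # 77.5
```

The model lands in the right ballpark for both data points in the thread, which is about as much as a two-factor linear model can promise.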
 
On my 2019 Intel iMac, 64GB RAM and 8GB VRAM, it takes about 33 seconds for 24 to 32 MP files. My 2020 MacBook Air M1, 16GB RAM, takes about 75 seconds. Both devices are running Sequoia and the latest version of LrC.
 
Many thanks. Yes, my Mac has 10 GPU cores and the Z8 is 45 MP, so that's about right. Thanks again. Russ.
 
Excellent explanation and video link.


 