
Develop module New Denoise feature

Mulder · New Member · Joined: Oct 3, 2024 · Messages: 12 · Location: Tyumen
Lightroom Experience: Power User · Lightroom Version: Classic 14.4 · Operating System: macOS 14 Sonoma
Denoise is now non-destructive and can be copied as part of the Details section settings.

Has anyone noticed changes in speed? Is it faster or slower? Is it still GPU-driven as before?
 
Re the size of DNG data in LR 14.4. I did some reasonably careful testing with 112 sample raws for Sony a7 R3 (7952 x 5304).

My conclusion: The size of the Denoise data can be much larger than 5 MB and can vary by 10x for a given camera. The size primarily depends on the camera resolution and the amount of detail in the photo. ISO doesn't seem to be a big determinant in the size.

Details

I downloaded 112 sample raws for the Sony a7 R3 camera from dpreview.com. They had a wide range of scenes, lighting, lenses, and ISO.

The normal size of the .arw is about 85 MB, but if you enable Sony ARW Compression, the size is about 40-43 MB.

I looked at the sizes of the Denoise data stored in the .xmp sidecars, which are 25% larger than what's stored in .lrcat-data (the sidecars use the Ascii85 text encoding for binary data). The average size in .lrcat-data was 12 MB (15 MB in sidecars), but the maximum size was 45 MB (.lrcat-data) / 56 MB (.xmp sidecars):
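That ~25% sidecar overhead follows directly from Ascii85, which encodes every 4 binary bytes as 5 ASCII characters. A minimal sketch using Python's stdlib `base64.a85encode` (the payload here is random stand-in data, not real Denoise output):

```python
import base64
import os

# Stand-in for a binary Denoise payload (not real Lightroom data).
payload = os.urandom(1_000_000)

# Ascii85 maps each 4-byte group to 5 text characters,
# hence the ~25% size increase seen in .xmp sidecars.
encoded = base64.a85encode(payload)
print(len(encoded) / len(payload))  # ≈ 1.25
```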
[Attachment 26589: chart of Denoise data sizes]

A user in the Adobe forum reported 500 raws from the same camera, with an average .lrcat-data size of 27 MB.

The sizes don't correlate strongly with ISO:

[Attachment 26590: scatterplot of Denoise data size vs ISO]

I believe the sizes vary primarily due to Camera Raw's compression of the Denoise data. I looked carefully at perhaps 15 of the photos across the range of sizes, and the size seems to correlate well with the amount of detail in the photo, which makes sense -- the more the detail, the less you can compress without loss of quality. The largest .xmp (56 MB) was almost entirely of tiny deciduous tree branches missing leaves.

In the scatterplot above, the higher ISO photos had smaller sizes. But these photos also had much less detail -- they were taken in dark scenes, with large shadow areas missing much detail. It may be that for a given scene, higher ISOs result in larger Denoise data due to sensor noise creating more detail, but I haven't investigated that.
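The detail-vs-compressibility effect is easy to demonstrate with any general-purpose compressor. A rough analogue using Python's stdlib `zlib`, with synthetic bytes standing in for smooth vs highly detailed image areas:

```python
import random
import zlib

random.seed(0)

# "Smooth" data (large flat areas, little detail) vs "noisy" data
# (per-byte variation, lots of detail) -- synthetic stand-ins only.
smooth = bytes([128]) * 100_000
noisy = bytes(random.randrange(256) for _ in range(100_000))

print(len(zlib.compress(smooth)))  # a few hundred bytes: highly compressible
print(len(zlib.compress(noisy)))   # close to 100,000 bytes: barely compressible
```

The same principle applies to the Denoise payload: the more fine detail there is to preserve, the less the data can shrink without quality loss.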
Thanks for the info.
 
Someone posted in the linked thread: https://community.adobe.com/t5/ligh...background-lrc/idi-p/15374596/page/4#comments as copied below.

The file bloat could become a concern for those with small hard drives - thinking of MacBook Pro and Air users who use Denoise a lot, e.g. for wildlife and birding, where light is rarely optimal. Presumably the data file size is limitless.

[Attachment: lrdata.JPG]
 
There are options. As I mentioned before, use a dedicated external SSD for the LrC folder. Or start creating annual catalogues and move the previous ones to an external drive, though that option defeats the purpose of the catalogue system.
 
Someone posted in the linked thread: https://community.adobe.com/t5/ligh...background-lrc/idi-p/15374596/page/4#comments as copied below.

The file bloat could become a concern for those with small hard drives - thinking of MacBook Pro and Air users who use Denoise a lot, e.g. for wildlife and birding, where light is rarely optimal. Presumably the data file size is limitless.
I see two things in that thread. One is the unnecessary use of XMP sidecar files, which will grow to hold the XMP data necessary to produce a derivative of the original. Sidecar files do not replace a master catalog and are redundant to that end. Which is more "bloat": duplicating the information stored in the Lightroom master catalog, or storing that same information in a complete derivative file with headers, metadata, and a modified RGB image? Second is the complaint of file bloat. All of the information is necessary to create a denoised derivative, so it can't be called "bloat". The reference to a catalog file that is only 1.78 MB is misleading when there are many (most?) catalogs that are gigabytes in size.

IMO Lightroom users should not be buying computers with less than 1 TB of primary storage. If you are stuck with an undersized primary drive and can't replace it with a larger computer or drive, you need to move all of your Lightroom efforts to a larger EHD. For years I was able to get by with an 8 TB EHD to store my photos. In the last year I have moved from the 8 TB to a 10 TB to a 16 TB volume for my image data. When I was using the 10 TB, it also stored my Lightroom folder with the master catalog and Previews folder. I did some serious housekeeping and my Lightroom folder is now back in my Pictures folder on my primary drive. When I got my Mac Studio, I opted for the 1 TB SSD primary because at the time I kept my catalog on an EHD with my images. I wish now that I had gone for the 2 TB SSD, as I struggle to keep enough free space on the primary disk for working storage and temporary files.
 
I see two things in that thread. One is the unnecessary use of XMP sidecar files, which will grow to hold the XMP data necessary to produce a derivative of the original. Sidecar files do not replace a master catalog and are redundant to that end. Which is more "bloat": duplicating the information stored in the Lightroom master catalog, or storing that same information in a complete derivative file with headers, metadata, and a modified RGB image? Second is the complaint of file bloat. All of the information is necessary to create a denoised derivative, so it can't be called "bloat". The reference to a catalog file that is only 1.78 MB is misleading when there are many (most?) catalogs that are gigabytes in size.

IMO Lightroom users should not be buying computers with less than 1 TB of primary storage. If you are stuck with an undersized primary drive and can't replace it with a larger computer or drive, you need to move all of your Lightroom efforts to a larger EHD. For years I was able to get by with an 8 TB EHD to store my photos. In the last year I have moved from the 8 TB to a 10 TB to a 16 TB volume for my image data. When I was using the 10 TB, it also stored my Lightroom folder with the master catalog and Previews folder. I did some serious housekeeping and my Lightroom folder is now back in my Pictures folder on my primary drive. When I got my Mac Studio, I opted for the 1 TB SSD primary because at the time I kept my catalog on an EHD with my images. I wish now that I had gone for the 2 TB SSD, as I struggle to keep enough free space on the primary disk for working storage and temporary files.
Good points. I've never liked XMP files and always considered them clutter. On other sites there is confusion about needing them. I never thought of the 1 TB SSD as anything other than Apple charging too much for it. My travel MB Air M3 that I just got with 512 GB will be fine. I normally get 256, but I waited too long and ran out of time before a trip for a special order. Might have been a blessing. I just got a Mac Mini M4 with 512. I'll see how things go over the next few years. With all of the feedback, Adobe may come up with something to help with that.
 
The file bloat could
With respect to the total disk usage of the old and new Denoise, it helps to separate out the issues, which have been conflated in all the angry yelling in the Adobe thread:

- The total space used by the new Denoise storing its data in the catalog's .lrcat-data file is typically much less than the old Denoise DNG files, as much as 75% less or more, depending on the camera and amount of detail in the photos. This reduction in disk space is partly responsible for the new Denoise being modestly faster than the old one, since the new Denoise is writing fewer bytes to disk. (Disk I/O is only a small part of the total execution time of Denoise.)

- With the old DNG Denoise, you could easily move the denoise data to drives other than the one storing the catalog folder. But with the new Denoise, the data must be stored in the catalog folder. This could obviously be a problem for people storing their catalog on a relatively small internal drive of a laptop. For some use cases, e.g. a volume professional photographer shooting mostly low-light or high ISO scenes who denoises most of their photos, their catalog folder could quickly get one or two orders of magnitude larger.

- When you've enabled saving metadata to disk (Automatically Write Changes To XMP), the denoise data gets stored in three places rather than one: in the .lrcat-data file (correct), in the photo files or their .xmp sidecars (correct), and in the catalog's .lrcat file (incorrect, a bug). There are two nasty symptoms of that bug:

a. The catalog folder gets even larger, a problem for people with limited space, as discussed above.
b. LR's mechanism for writing metadata to disk in the background chokes, leaving LR mostly unusable until all the denoise data has been saved. I think some of this choking is caused by the extra nonstop writes to the catalog file.

I don't think this choking is caused by the SQLite relational database used to store the catalog (the .lrcat file), which can scale to very large sizes without a significant impact on most database operations. Rather it's the mechanism that LR has layered on top of that for managing the saving of metadata to sidecars that is at fault. For many years, it's operated smoothly and silently, but in the last couple of years Adobe has struggled with bugs with it. Incorrectly writing that data into the catalog database at the same time as writing it to the .xmp sidecars has brought it to its knees.
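A quick back-of-the-envelope calculation shows why the catalog-folder growth matters. It assumes the ~12 MB average Denoise payload measured earlier in this thread; the photo count is a hypothetical example:

```python
# Hypothetical volume shooter; 12 MB is the average .lrcat-data
# payload measured earlier in this thread for a 42 MP Sony a7R III.
avg_denoise_mb = 12
photos_denoised = 5_000

growth_gb = avg_denoise_mb * photos_denoised / 1024
print(round(growth_gb, 1))  # ≈ 58.6 GB added to the catalog folder
```

For a catalog folder that previously held only the .lrcat file and previews, tens of gigabytes of extra, non-relocatable data is the "one or two orders of magnitude" problem described above.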
 
With respect to the total disk usage of the old and new Denoise, it helps to separate out the issues, which have been conflated in all the angry yelling in the Adobe thread:

- The total space used by the new Denoise storing its data in the catalog's .lrcat-data file is typically much less than the old Denoise DNG files, as much as 75% less or more, depending on the camera and amount of detail in the photos. This reduction in disk space is partly responsible for the new Denoise being modestly faster than the old one, since the new Denoise is writing fewer bytes to disk. (Disk I/O is only a small part of the total execution time of Denoise.)

- With the old DNG Denoise, you could easily move the denoise data to drives other than the one storing the catalog folder. But with the new Denoise, the data must be stored in the catalog folder. This could obviously be a problem for people storing their catalog on a relatively small internal drive of a laptop. For some use cases, e.g. a volume professional photographer shooting mostly low-light or high ISO scenes who denoises most of their photos, their catalog folder could quickly get one or two orders of magnitude larger.

- When you've enabled saving metadata to disk (Automatically Write Changes To XMP), the denoise data gets stored in three places rather than one: in the .lrcat-data file (correct), in the photo files or their .xmp sidecars (correct), and in the catalog's .lrcat file (incorrect, a bug). There are two nasty symptoms of that bug:

a. The catalog folder gets even larger, a problem for people with limited space, as discussed above.
b. LR's mechanism for writing metadata to disk in the background chokes, leaving LR mostly unusable until all the denoise data has been saved. I think some of this choking is caused by the extra nonstop writes to the catalog file.

I don't think this choking is caused by the SQLite relational database used to store the catalog (the .lrcat file), which can scale to very large sizes without a significant impact on most database operations. Rather it's the mechanism that LR has layered on top of that for managing the saving of metadata to sidecars that is at fault. For many years, it's operated smoothly and silently, but in the last couple of years Adobe has struggled with bugs with it. Incorrectly writing that data into the catalog database at the same time as writing it to the .xmp sidecars has brought it to its knees.
This. It's one thing to keep the images - original and DNG derivative - on an external drive, another to keep the LR catalog and other LR data files on an EHD. I guess it's another workflow change we'll get used to.
 
- When you've enabled saving metadata to disk (Automatically Write Changes To XMP), the denoise data gets stored in three places rather than one: in the .lrcat-data file (correct), in the photo files or their .xmp sidecars (correct), and in the catalog's .lrcat file (incorrect, a bug).
I was not aware of the third storage location. But this is an important statement about XMP that should not be ignored.

A file contains header information, metadata, and data block(s).
An XMP sidecar file carries the additional overhead of its own file header, whereas a DNG only needs header information stored once per file, and a catalog only has one header for all of the information contained in it.

One thing that bothers me is running Lightroom desktop locally without the cloud component. All of that edit information and metadata gets stored in a separate XMP file and not compactly inside a single catalog database file. It seems to me that those local Lightroom desktop users are going to be in for a surprise when their tiny laptop disk drives fill up, making the laptop unusable.


Sent from my iPad using Tapatalk
 
Re the size of DNG data in LR 14.4. I did some reasonably careful testing with 112 sample raws for Sony a7 R3 (7952 x 5304).

My conclusion: The size of the Denoise data can be much larger than 5 MB and can vary by 10x for a given camera. The size primarily depends on the camera resolution and the amount of detail in the photo. ISO doesn't seem to be a big determinant in the size.

Details

I downloaded 112 sample raws for the Sony a7 R3 camera from dpreview.com. They had a wide range of scenes, lighting, lenses, and ISO.

The normal size of the .arw is about 85 MB, but if you enable Sony ARW Compression, the size is about 40-43 MB.

I looked at the sizes of the Denoise data stored in the .xmp sidecars, which are 25% larger than what's stored in .lrcat-data (the sidecars use the Ascii85 text encoding for binary data). The average size in .lrcat-data was 12 MB (15 MB in sidecars), but the maximum size was 45 MB (.lrcat-data) / 56 MB (.xmp sidecars):
[Attachment 26589: chart of Denoise data sizes]

A user in the Adobe forum reported 500 raws from the same camera, with an average .lrcat-data size of 27 MB.

The sizes don't correlate strongly with ISO:

[Attachment 26590: scatterplot of Denoise data size vs ISO]

I believe the sizes vary primarily due to Camera Raw's compression of the Denoise data. I looked carefully at perhaps 15 of the photos across the range of sizes, and the size seems to correlate well with the amount of detail in the photo, which makes sense -- the more the detail, the less you can compress without loss of quality. The largest .xmp (56 MB) was almost entirely of tiny deciduous tree branches missing leaves.

In the scatterplot above, the higher ISO photos had smaller sizes. But these photos also had much less detail -- they were taken in dark scenes, with large shadow areas missing much detail. It may be that for a given scene, higher ISOs result in larger Denoise data due to sensor noise creating more detail, but I haven't investigated that.
Wow, thank you again for that deep test. Just one question: are compressed a7 R3 raws lossy or lossless? Because that might be another factor which might influence the AI denoise.

As an example, on my a6700 I have both options for compression (but not uncompressed raws, although I suppose lossless must be absolutely equal to uncompressed). And I don't remember exactly how or when they change, but "normally" a6700 lossy raws are 12-bit in addition to being lossy, whereas lossless raws are always 14-bit. (I know this from a LOOOONG discussion in the dpreview forums: if you want your a6700 to perform much better regarding rolling shutter distortion while shooting with electronic shutter, the rule of thumb is to go to either JPG or lossy raws, with their much faster readout speed versus a nearly always negligible quality drop.)

Maybe both factors (the lossy compression and the bit-depth drop) can influence the "noise data" the picture contains, and therefore the extra lrcat-data space needed.

As a side note: as I already posted, in a much, much, much shallower test I made with my a6700 with a real set of 38 indoor children's basketball pictures (remember the a6700 is 6192 x 4128), they took 1.35 GB (36-37 MB/arw on average), which went to an extra 2.5 GB in denoised DNGs (67 MB/DNG), which decreased to just about 200 MB in lrcat-data (5 MB/picture)... Let's say 10% of the originally "used/wasted" space.

Again, I cannot understand how or why people complain about the "amount of space" the denoise data uses. It's much less than the older DNG approach and much less than the pictures themselves. If you don't have enough disk space in your system (I don't care if it's a Mac or whatever), you don't have enough space because digital photography is quite storage hungry. Period. You'll have to analyze your whole approach to the subject... (And quite probably the solution is not so difficult: just buy an external drive; they're pretty cheap and small nowadays, whether HDD or SSD.)

(And you should know that digital video is even more storage hungry, in case you also get surprised by it. Please forgive me if I sound too sarcastic, but I just cannot believe anybody complains about the new behaviour, except about the background/foreground behaviour change. Or bugs.)
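The a6700 numbers above work out as follows (pure arithmetic on the figures quoted in the post):

```python
# Figures from the post above: 38 denoised a6700 raws.
photos = 38
raw_gb = 1.35    # total .arw size
dng_gb = 2.5     # extra space the old denoised DNGs took
lrcat_mb = 200   # extra space in .lrcat-data with the new Denoise

print(round(raw_gb * 1024 / photos))            # ≈ 36 MB per .arw
print(round(dng_gb * 1024 / photos))            # ≈ 67 MB per DNG
print(round(lrcat_mb / photos, 1))              # ≈ 5.3 MB per photo
print(round(100 * lrcat_mb / (dng_gb * 1024)))  # ≈ 8% of the old DNG space
```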

- When you've enabled saving metadata to disk (Automatically Write Changes To XMP), the denoise data gets stored in three places rather than one: in the .lrcat-data file (correct), in the photo files or their .xmp sidecars (correct), and in the catalog's .lrcat file (incorrect, a bug). There are two nasty symptoms of that bug:
And again, thank you for your insight. I don't use sidecars, but knowing they're currently buggy is always handy.
 
I noticed that too. I had to test a few times, as it sometimes added when I thought it should have removed, and vice versa. It appears the new Denoise process only adds about 1 MB per 30 MB file in lrcat-data, which is nowhere near the 50 MB DNGs it used to create. Not much to worry about.
I am sorry, but I am getting a little tired of the argument that it "is nowhere near the 50 MB DNGs it used to create". I have never understood why anyone would want to keep those DNGs. I used them to generate a JPEG and then threw them away. After all, if needed you could always denoise the original again.
 
I am sorry, but I am getting a little tired of the argument that it "is nowhere near the 50 MB DNGs it used to create". I have never understood why anyone would want to keep those DNGs. I used them to generate a JPEG and then threw them away. After all, if needed you could always denoise the original again.
Well, you can also do exactly the same with the new behaviour. Just delete history and, sooner or later, (remember it's not immediate), when you optimize your catalog, the AI denoise data will be purged from lrcat-data.

That was exactly my main concern with the new behaviour, that lrcat-data just seemed to grow. But no, sometimes it shrinks... (My happy surprise about it)

And even if it weren't the case, then it would be clear that it would be A BUG, just like the one John R Ellis has posted 5 messages before this one.
 
I am sorry, but I am getting a little tired of the argument that it "is nowhere near the 50 MB DNGs it used to create". I have never understood why anyone would want to keep those DNGs. I used them to generate a JPEG and then threw them away. After all, if needed you could always denoise the original again.
Interesting point. The DNG was not a raw file, and with the JPEG XL compression there were some concerns, often called destructive denoising. I read articles about it and didn't really see much difference, but I'm not an expert in that area. I know I didn't like the 200 MB TIFF files it produced before Adobe reduced the size.

It has been interesting. I'm on several forums, and a lot of people, including myself, could hardly wait to get rid of the DNG system. If you denoise first, as was suggested with the DNG process, then you no longer work on a raw file for the rest of the edits. Again, I'm no expert and not sure how much difference it makes, but it sure sounds more appealing to me than working with the DNG. I don't see the point of the raw file if right away it becomes a compressed JPEG XL. Also, I prefer being able to tweak the denoised file as often as I like with instant results. Cletus made a good point about being able to choose the amount before applying it. I can't see that being a tough fix.
 
Cletus made a good point about being able to choose the amount before applying it. I can't see that being a tough fix.
Being able to vary the amount after the Denoise is computed is a plus, since you do not need to recompute the denoise each time you want to adjust the intensity. However, I am wishing for a user-set default instead of an arbitrary 50.
 
Being able to vary the amount after the Denoise is computed is a plus, since you do not need to recompute the denoise each time you want to adjust the intensity. However, I am wishing for a user-set default instead of an arbitrary 50.
That is why I said it can't be that tough. All the values are computed.
 
Denoise is now non-destructive and can be copied as part of the Details section settings.

Has anyone noticed changes in speed? Is it faster or slower? Is it still GPU-driven as before?
There are a few things I noticed while using this new feature:

1) Denoise now works slightly faster. One member mentioned approximately 20 seconds per image. I tested it on my MacBook Pro M1 Max with 32 GB, and the new feature was on average 1-1.5 seconds faster per image. On 1,000 images that saves you roughly 17-25 minutes.

2) In my view, the time required per image depends more on the hardware you are using. I recently purchased a Mac Studio M4 Max with 128 GB. That machine takes approximately 8 seconds per image with the old method and 6-7 with the new process.

3) The process remains mainly GPU driven, which explains the speed difference between the M1 Max and the Mac Studio M4 Max (24 vs 40 GPU cores).

4) The files indeed are a bit bigger now, but then again the "starting files" are a bit smaller (converting NEF to DNG in itself makes the file larger). I haven't noticed any significant difference in total storage required.

5) The quality is, as far as I can tell, very slightly improved, with one major difference: with the old process, denoise was either on or off, whereas the new process has a slider which allows you to vary the strength/impact of denoising. I find that very useful.

But what's missing (at least I couldn't find it) is the option to apply it only to a masked section of the image. Example: sometimes I have an image where I want denoise of the background to be stronger than denoise of the subject. But I still haven't figured out how to do that.

Thanks

Jurry de Vries
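The batch savings in point 1 can be sanity-checked with quick arithmetic: at 1-1.5 seconds saved per image, 1,000 images lands in roughly the 17-25 minute range:

```python
# Per-image savings reported in point 1 above.
saved_low_s, saved_high_s = 1.0, 1.5
images = 1_000

print(round(saved_low_s * images / 60, 1))   # 16.7 minutes
print(round(saved_high_s * images / 60, 1))  # 25.0 minutes
```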
 
But what's missing (at least I couldn't find it) is the option to apply it only to a masked section of the image. Example: sometimes I have an image where I want denoise of the background to be stronger than denoise of the subject. But I still haven't figured out how to do that.
This is a good idea but something that is not (yet) implemented. While many basic develop features are included in the Mask, Denoise is not one of them. If you submit a feature request to Adobe I would add my support.
 
There are a few things I noticed while using this new feature:

1) Denoise now works slightly faster. One member mentioned approximately 20 seconds per image. I tested it on my MacBook Pro M1 Max with 32 GB, and the new feature was on average 1-1.5 seconds faster per image. On 1,000 images that saves you roughly 17-25 minutes.

2) In my view, the time required per image depends more on the hardware you are using. I recently purchased a Mac Studio M4 Max with 128 GB. That machine takes approximately 8 seconds per image with the old method and 6-7 with the new process.

3) The process remains mainly GPU driven, which explains the speed difference between the M1 Max and the Mac Studio M4 Max (24 vs 40 GPU cores).

4) The files indeed are a bit bigger now, but then again the "starting files" are a bit smaller (converting NEF to DNG in itself makes the file larger). I haven't noticed any significant difference in total storage required.

5) The quality is, as far as I can tell, very slightly improved, with one major difference: with the old process, denoise was either on or off, whereas the new process has a slider which allows you to vary the strength/impact of denoising. I find that very useful.

But what's missing (at least I couldn't find it) is the option to apply it only to a masked section of the image. Example: sometimes I have an image where I want denoise of the background to be stronger than denoise of the subject. But I still haven't figured out how to do that.

Thanks

Jurry de Vries
The current method does not offer masking. I always use the Masking slider in the Detail panel. I further denoise the background by opening Masking and selecting the background. I reduce Texture and Sharpening and increase the Noise slider (manual, non-AI). I don't touch Clarity. I can wipe out noise completely for a silky smooth finish. Great for screen media, but for print a little background noise helps prevent posterization.

The video is a bit old but relevant. Masking is near the end. I was doing this before I found the video, because I was inspired by his two-step LrC-to-PS video set, which I purchased. You make a virtual copy, edit one for the subject and the other for the background, and send both to PS for blending. That method became obsolete once advanced masking was introduced. There may be other applications for it I'm not aware of.

https://www.youtube.com/watch?v=NMYh5grOBa0
 
The original Denoise that created a DNG allowed you to choose an intensity level before you started. It also remembered the last setting, so if you usually use 80, Denoise defaulted to that last remembered setting. I am just annoyed that they changed the process to a fixed "50" instead of the last remembered value as it was.
While it bothers me, I am not yet bothered enough to create my own custom profile.
I have been having many issues with the length of time it takes to even move the slider - regardless of whether I do it for a single file or batch several files at once. While I can tolerate having to find other things to do during the initial Denoising, sometimes it will take just as long to change the setting from 50 to 67. This has severely impacted my workflow.
 
I have been having many issues with the length of time it takes to even move the slider - regardless of whether I do it for a single file or batch several files at once. While I can tolerate having to find other things to do during the initial Denoising, sometimes it will take just as long to change the setting from 50 to 67. This has severely impacted my workflow.

DeNoise defaults to 50 on the slider. You cannot change that until DeNoise completes; only then can you move the slider. It moves easily for me after the DeNoise process has finished.


Sent from my iPad using Tapatalk
 
DeNoise defaults to 50 on the slider. You cannot change that until DeNoise completes; only then can you move the slider. It moves easily for me after the DeNoise process has finished.


Instant results tweaking the slider after the initial Denoise process for me as well.
 
DeNoise defaults to 50 on the slider. You cannot change that until DeNoise completes; only then can you move the slider. It moves easily for me after the DeNoise process has finished.


I am glad I was able to find this thread, as you touched on exactly what I was going to post a question about. I had a two-part question and you have answered the second part. As I got to know LrC DeNoise and my camera's noise levels at different ISO settings, I often knew what DeNoise setting I wanted and would set it for my first run. We can't do this now (so glad I upgraded my 7-year-old computer; Denoise now takes about 6 seconds vs 6-11 minutes!).

Given that I generally know what setting I want, and it is seldom 50, is it possible somewhere in some setting to indicate a different, more probable default value (say, 40)? That way, the first, automatic run of DeNoise would have a higher probability of being one I would want to use.

Thanks
 
DeNoise defaults to 50 on the slider. You cannot change that until DeNoise completes; only then can you move the slider. It moves easily for me after the DeNoise process has finished.


Perhaps I did not make myself clear. Once the initial Denoise completes (during which time I am unable to continue working in Lightroom Classic), when I try to use the slider to change the level, it sometimes causes Lightroom to go into a black hole before I have even set a new number. Other times, when I am able to set the new level, say to 67, it can still take as long to apply the new setting as it did to initially denoise the file(s). The fact that it works for you is great. However, in my case I am having considerable performance issues on my iMac with an M1 chip running Sequoia 15.5.
 
Perhaps I did not make myself clear. Once the initial Denoise completes (during which time I am unable to continue working in Lightroom Classic), when I try to use the slider to change the level, it sometimes causes Lightroom to go into a black hole before I have even set a new number. Other times, when I am able to set the new level, say to 67, it can still take as long to apply the new setting as it did to initially denoise the file(s). The fact that it works for you is great. However, in my case I am having considerable performance issues on my iMac with an M1 chip running Sequoia 15.5.
That sounds like you are a little light in the computer-specs department.

Here's what Adobe says the minimums are:
https://helpx.adobe.com/lightroom-cc/system-requirements.html

However, to run the latest AI features, the recommended specs are closer to the real minimum.

Ideally you'll need an Apple Silicon processor with 32 GB of unified memory and plenty of CPU and GPU cores, or the Intel/Nvidia equivalent.
What are your specs, and how do they compare?
 