
Does LrC have any kind of command-line startup switches?

Status
Not open for further replies.

Ken @ Canadian Rockies

Active Member
Premium Classic Member
Joined
Mar 19, 2016
Messages
147
Location
Canadian Rocky Mountains
Lightroom Experience
Advanced
Lightroom Version
Classic
I have had problems with all versions of LrC, both speed- and graphics-wise. I am running a Threadripper 2950X system with 128 GB of RAM. Sometimes I feel that having 128 GB of RAM has created problems. I just updated my video card (now 352-bit memory bus width instead of 256-bit), hoping to fix the graphical weirdness in Lightroom and Photoshop. Nope, it's still there. The problem is mostly with minimizing and maximizing: elements take a long time to draw on screen.

Below is a screenshot of my memory usage. The virtual memory figure is larger than my physical RAM. I wish I could tell LrC to stop using virtual memory, or at least limit it. Hard to do, for sure.

Oh, and how can I tell Adobe my screen is not 168 dpi but 163, by the math? This might be the problem.

1650725618739.png
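For what it's worth, a 163 ppi figure can come straight out of simple math. A minimal sketch, assuming a hypothetical 27-inch 3840×2160 panel (the thread doesn't state the actual panel size or resolution): pixels per inch is the diagonal resolution in pixels divided by the diagonal size in inches.

```python
import math

# Hypothetical example panel: 27-inch 4K (3840x2160).
# PPI = diagonal resolution in pixels / diagonal size in inches.
width_px, height_px = 3840, 2160
diagonal_in = 27.0

diagonal_px = math.hypot(width_px, height_px)  # sqrt(w^2 + h^2)
ppi = diagonal_px / diagonal_in
print(round(ppi))  # -> 163
```

The OS, on the other hand, typically reports whatever DPI the display's EDID data claims, which is why the reported value and the computed value can disagree.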
 
These numbers are reported to LrC by the OS.
I think (?) the virtual memory figure is the swap file. Lightroom is paging out memory that it has reserved but is not using, so that other apps can have access to real memory. I don't think there are many apps that will access more than a fraction of that 128 GB. You can probably run many, many apps with 128 GB, but few if any will consume more than 64 GB.

As for the screen size, the 168 dpi is the value reported to the OS by one of your three displays.


Sent from my iPad using Tapatalk
 
Yes, that makes sense. With a few Adobe apps and DxO open (with many large photos loaded) I have seen just over 71 GB in use. I was surprised. When doing video editing it goes way up during rendering, but never much past 86 GB. I know I could have stuck with 64 GB, but it was just a "because" thing... lol

I've played a lot with the Windows VM swap file, but it usually caused more grief than benefit. Oh well.

Thanks Cletus
 
I've played a lot with the Windows VM swap file, but it usually caused more grief than benefit. Oh well.

Thanks Cletus
With 128 GB of physical memory, how often does Windows need to use the swap file?

Me, I have "only" 32 GB of RAM. I have used the Windows control panel settings to create a permanent 64 GB swap file. That avoids allocation and de-allocation overhead. It also allows me to put the swap file on drive D: or E:, etc.
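As a sketch of that sizing approach: a common rule of thumb (not an official recommendation, and the factors here are illustrative, since the poster above chose a flat 2× instead) sets the initial page file size to 1.5× RAM and the maximum to 3× RAM. Setting initial and maximum to the same value is what makes the file "permanent" and avoids the grow/shrink overhead.

```python
def fixed_pagefile_gb(ram_gb, initial_factor=1.5, max_factor=3.0):
    """Rule-of-thumb sizing for a page file.
    For a truly fixed (permanent) page file, set initial == maximum
    in the Windows dialog; the split returned here just shows the
    commonly quoted 1.5x / 3x bounds."""
    return ram_gb * initial_factor, ram_gb * max_factor

initial, maximum = fixed_pagefile_gb(32)
print(initial, maximum)  # -> 48.0 96.0
```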
 
With 128 GB of physical memory, how often does Windows need to use the swap file?

Me, I have "only" 32 GB of RAM. I have used the Windows control panel settings to create a permanent 64 GB swap file. That avoids allocation and de-allocation overhead. It also allows me to put the swap file on drive D: or E:, etc.
Well, no matter whether I set processor scheduling priority to "Background services" or "Programs", Windows always sets up an 8 GB swap file. Crazy to me, but I have read a lot on this subject, and it does not matter how much RAM you have; the Windows OS has been built to have many processes use virtual memory. Hopefully one day this gets a bit better. My VM is on my M.2 drive, but still...

My video rendering has sped up a lot. No slowdowns on my PC when doing renders now at all. Very nice.
 
With 128 GB of physical memory, how often does Windows need to use the swap file….
I think Windows ALWAYS creates a swap file, and apps will offload parts of their code that are not actively being used. A LrC example might be the parts of the app that handle the Print, Map, and Book modules; these don't need to remain in active memory until the user calls for them.


Sent from my iPad using Tapatalk
 
With 128 GB of physical memory, how often does Windows need to use the swap file?
I am curious about this too. I remember fine-tuning this when memory was scarce, but space on slow spinning C: drives was also scarce then, and 4 GB was a lot of memory.

My assumption is that Photoshop uses its defined scratch/cache drives when it gets stuck for memory, and that the Lightroom raw converter uses the Camera Raw cache defined within Lr. So I stopped worrying about these Windows virtual memory settings. Also, I installed 64 GB of memory, leaving the option to upgrade to a total of 128 GB, but my instinct is that I have not seen much benefit from upgrading from 32 GB to 64 GB.
 
I am curious about this too. I remember fine-tuning this when memory was scarce, but space on slow spinning C: drives was also scarce then, and 4 GB was a lot of memory.

My assumption is that Photoshop uses its defined scratch/cache drives when it gets stuck for memory, and that the Lightroom raw converter uses the Camera Raw cache defined within Lr. So I stopped worrying about these Windows virtual memory settings. Also, I installed 64 GB of memory, leaving the option to upgrade to a total of 128 GB, but my instinct is that I have not seen much benefit from upgrading from 32 GB to 64 GB.
Do you also make videos? And if so, did your system hardly notice the extra RAM?
 
No… I do not create or edit videos. I know that would place demands on memory. I opted for a motherboard with the possibility of 128 GB of memory and installed 64 GB, with the objective of improving Lr performance. I cannot prove it, but my instinct tells me that I have seen no improvement going from 32 GB to 64 GB, so I have no incentive to upgrade further to 128 GB.

I will check the Windows virtual memory settings tomorrow. I remember in the past configuring them so that the location and size were fixed, but they are probably at the Windows default values now.
 
Sometimes I feel having 128 GB ram has created problems.
I would be surprised if that much memory is your problem. Your resource list did not list all of your plug-ins. I'm wondering if one of those is causing problems. Have you looked at LightroomQueen's performance blog for ideas?
I've played a lot with the Windows VM swap file, but it usually caused more grief than benefit. Oh well.
I'd be careful here. There is a difference between VM management and swapping. VM will utilize the paging file to free up memory; "swapping" normally refers to a little-used process being removed from RAM entirely. I believe Windows has a paging file rather than a swap file. Here's an article on setting the optimum page file size.

There is a 'dance' between an application like LrC and the virtual memory system of Windows.
My assumption is that Photoshop uses some of its defined cache drives or raw cache when it gets stuck for memory and the Lightroom Raw converter uses the relevant cache defined within Lr.
@Gnits I would not expect any app to be implementing its own memory management system these days. A cache may be used to pass large amounts of information, like an image, from one app to another in the background.

So, I still suspect the issue @Ken @ Canadian Rockies is having is likely something other than memory.
 
I would be surprised if that much memory is your problem. Your resource list did not list all of your plug-ins. I'm wondering if one of those is causing problems. Have you looked at LightroomQueen's performance blog for ideas?
Yes, I have either deleted or disabled pretty much all of my plug-ins for this reason. Thanks.

I'd be careful here. There is a difference between VM management and swapping. VM will utilize the paging file to free up memory; "swapping" normally refers to a little-used process being removed from RAM entirely. I believe Windows has a paging file rather than a swap file. Here's an article on setting the optimum page file size.
I will check this out. Thanks. I was just talking about the "Advanced" virtual memory paging file. I've called it the swap file forever... maybe I need to change that habit.
 
So, I still suspect the issue @Ken @ Canadian Rockies is having is likely other than memory.
I also suspect this. When I had a simple AMD chip and an nVidia 960 card, I never saw what I see now. It's like I have a bad motherboard, or the version 2 Threadripper was just bad and needed to end... so they ended it.

It has been an on-going problem for over two years since I bought this dead-end system.
 
I've played a lot with the Windows VM swap file, but it usually caused more grief than benefit. Oh well.
I'd be careful here. There is a difference between VM management and swapping. VM will utilize the paging file to free up memory; "swapping" normally refers to a little-used process being removed from RAM entirely. I believe Windows has a paging file rather than a swap file. Here's an article on setting the optimum page file size.

There is a 'dance' between an application like LrC and the virtual memory system of Windows.

On current versions of Windows and macOS, there is a third variable that isn’t always accounted for in discussions, in addition to traditional real RAM and VM swap, and that is compressed memory. Compressed memory has become just as important ever since CPUs got fast enough to compress/decompress RAM in less time than it would take to swap it to and from storage. Memory management is now highly dynamic, constantly shuffling data among real, compressed, and swap memory. Attempts to manually intervene are unlikely to keep up with that.

For example, if memory demands are not overwhelming, and if compressing RAM would be faster than swapping to the current swap file location, Windows/macOS may compress before swapping, to save time. So, sometimes, when trying to account for memory usage and not finding it all in real RAM or the swap file, it may be sitting in compressed RAM. The process monitor utilities on both Windows and macOS report compressed RAM usage along with real RAM and VM swap.

Also, VM usage can be high even if current memory needs are not high, if applications decide to retain older memory contents in the swap file instead of purging it. And some applications such as Photoshop sometimes decide to pre-allocate RAM and swap space in anticipation of near future needs, again not necessarily reflecting what is being edited right now. That is sometimes why users complain that an application is using a lot of memory before they’ve done anything…it may have been pre-allocated so it’s ready to go.
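The compress-before-swap tradeoff can be sketched in a few lines: compress a buffer in memory and compare the elapsed time against an estimated write to a swap file. The 200 MB/s write speed here is an assumption for illustration only, not a measured value, and real memory pages compress far less predictably than this repetitive buffer.

```python
import time
import zlib

# A page of highly compressible data (real app memory varies widely).
data = b"lightroom " * 400_000  # ~4 MB

start = time.perf_counter()
compressed = zlib.compress(data, 1)  # fastest compression level
compress_secs = time.perf_counter() - start

# Hypothetical storage write speed (assumed, not measured): 200 MB/s.
swap_secs = len(data) / (200 * 1024 * 1024)

print(f"compress: {compress_secs * 1000:.1f} ms -> {len(compressed)} bytes")
print(f"estimated swap write: {swap_secs * 1000:.1f} ms")

# The round trip is lossless, so the memory contents survive compression.
assert zlib.decompress(compressed) == data
```

When the compress time beats the estimated write time, keeping the data in compressed RAM is the faster choice, which is roughly the decision the OS memory manager is making continuously.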
 
On current versions of Windows and macOS, there is a third variable that isn’t always accounted for in discussions, in addition to traditional real RAM and VM swap, and that is compressed memory. Compressed memory has become just as important ever since CPUs got fast enough to compress/decompress RAM in less time than it would take to swap it to and from storage. Memory management is now highly dynamic, constantly shuffling data among real, compressed, and swap memory. Attempts to manually intervene are unlikely to keep up with that.

For example, if memory demands are not overwhelming, and if compressing RAM would be faster than swapping to the current swap file location, Windows/macOS may compress before swapping, to save time. So, sometimes, when trying to account for memory usage and not finding it all in real RAM or the swap file, it may be sitting in compressed RAM. The process monitor utilities on both Windows and macOS report compressed RAM usage along with real RAM and VM swap.

Also, VM usage can be high even if current memory needs are not high, if applications decide to retain older memory contents in the swap file instead of purging it. And some applications such as Photoshop sometimes decide to pre-allocate RAM and swap space in anticipation of near future needs, again not necessarily reflecting what is being edited right now. That is sometimes why users complain that an application is using a lot of memory before they’ve done anything…it may have been pre-allocated so it’s ready to go.
Thanks for this Conrad. Interesting information I have never seen or read about. But I am getting old...
Now I see the "In use (Compressed)" figure in my Task Manager. Still learning...

1650947738627.png
 
I have 128 GB and a Threadripper (3970X, which is 32 real cores). I have been disappointed in Lightroom's performance on it, not because it is particularly slow, but because it did not get much faster when I changed from a 4/8-core machine with much less memory. Classic has gotten better over the years at parallelism, but it is not good. I think the architecture and Lua combine to make it difficult for LR to actually use all the resources modern systems can throw at it.

Regarding paging: there are a lot of misleading statistics in Windows; not that they are wrong, but they are misleading. The number of interest, to see whether you are memory-constrained for a (normal) program, is "Hard Faults". Go to Task Manager > Performance, click Open Resource Monitor (at the bottom), and in that window go to the Memory tab. It shows all the processes; look at the "Hard Faults" column.

Hard faults are normal when a program is starting; most programs use that mechanism to read the program into memory from disk. So you need to get Lightroom up and running for a while, doing whatever your normal work is, so that it is all in memory and ready. Then, as you work (e.g. building a bunch of previews or exports), watch to see whether Hard Faults shows significant numbers (say, more than a dozen or so over time).

You have to look at it in the context of how much free memory is available (it shows right under that). If free gets small, you have memory constraints. If free is large and you still have significant hard faults, it is Adobe's fault: they are not making use of available memory efficiently.
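The decision rule in the last two paragraphs can be sketched as a small function. The thresholds (a dozen faults, 1 GB free) are illustrative guesses based on the post, not measured values.

```python
def diagnose(free_mb, hard_faults_per_min,
             fault_threshold=12, low_free_mb=1024):
    """Rough diagnostic rule from the text above:
    - few hard faults            -> memory is not the bottleneck
    - faults + low free memory   -> you are memory constrained
    - faults + plenty free       -> the app isn't using free memory well
    Thresholds are illustrative, not measured."""
    if hard_faults_per_min < fault_threshold:
        return "memory OK"
    if free_mb < low_free_mb:
        return "memory constrained"
    return "app not using free memory efficiently"

print(diagnose(free_mb=80_000, hard_faults_per_min=3))   # -> memory OK
print(diagnose(free_mb=512, hard_faults_per_min=200))    # -> memory constrained
```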

Below (first screenshot) is mine running a 1:1 preview build of a folder not recently touched. It's consuming a nice 46 GB and faulting only slightly (it also appears to be the cause of "MsMpEng" faulting, which is Microsoft's anti-malware engine, probably kicked off by all the new files being accessed).

But here's an example (2nd screen shot) of Adobe's parallelism working well. I am running a preview build of 3900 images. Notice the CPU graph is darn close to 100% all the time. That's with 64 cores, so it parceled out those images to a LOT of threads.

But if you do some jobs where things seem to go really slowly, and watch the CPU, disk, and other stats, you will see times when Lightroom is running very slowly and nothing is busy: CPUs near idle, disks near idle, etc. The cause there is usually that the work is single-threaded; only one of your MANY CPU cores is actually being used, which shows up as almost negligible CPU busy time. But since it is single-threaded, that is all it can do. Those are the times Lightroom really pokes along, and when that happens it's not a system thing, it is an Adobe architecture thing.

Lightroom is OLD code, OLD platform. It dates from when having two cores was impressive, not 64. Adobe has done a lot to improve the spaghetti mess it must be under the covers, but there is a LOT to be done.


hard.jpg

cpu.jpg
 
I have 128 GB and a Threadripper (3970X, which is 32 real cores). I have been disappointed in Lightroom's performance on it, not because it is particularly slow, but because it did not get much faster when I changed from a 4/8-core machine with much less memory. Classic has gotten better over the years at parallelism, but it is not good. I think the architecture and Lua combine to make it difficult for LR to actually use all the resources modern systems can throw at it.

Regarding paging: there are a lot of misleading statistics in Windows; not that they are wrong, but they are misleading. The number of interest, to see whether you are memory-constrained for a (normal) program, is "Hard Faults". Go to Task Manager > Performance, click Open Resource Monitor (at the bottom), and in that window go to the Memory tab. It shows all the processes; look at the "Hard Faults" column.

Hard faults are normal when a program is starting; most programs use that mechanism to read the program into memory from disk. So you need to get Lightroom up and running for a while, doing whatever your normal work is, so that it is all in memory and ready. Then, as you work (e.g. building a bunch of previews or exports), watch to see whether Hard Faults shows significant numbers (say, more than a dozen or so over time).

You have to look at it in the context of how much free memory is available (it shows right under that). If free gets small, you have memory constraints. If free is large and you still have significant hard faults, it is Adobe's fault: they are not making use of available memory efficiently.

Below (first screenshot) is mine running a 1:1 preview build of a folder not recently touched. It's consuming a nice 46 GB and faulting only slightly (it also appears to be the cause of "MsMpEng" faulting, which is Microsoft's anti-malware engine, probably kicked off by all the new files being accessed).

[snip]

Thank you Linwood, for your incredibly detailed information. I am familiar with hard faults and have looked on and off at the different things Resource Monitor can show us, but not while doing what you suggest. Good idea! I will have to check this out. It is very hard to get my CPU usage above 17% or 46% on any given task. I have seen it go high, but rarely.
 