
Benchmark available for testing Lightroom Classic performance on Windows computers


Roelof Moorlag

Senior Member
Joined
Jan 27, 2013
Messages
1,468
Location
Netherlands
Lightroom Experience
Power User
Lightroom Version
Classic
Lightroom Version Number
Classic CC
Operating System
Windows 10
So is anyone interested in sharing the beta results? I ran it and got the below. Note the last step appears to hang (removing DNGs, if I recall), but I just left it for a while and it eventually finished. It also takes quite a while to run -- start it and go to dinner or something.
The differences in Develop Module brush lag times between the three file types are significant... and brutal for poor D850 owners...
 
But you know, I just went from a D800 as my largest file to a Sony A7RIV, which is 61 MP and about 120 MB per file, and while it is not exactly zippy, I was quite pleasantly surprised to find it did not respond all that differently. Preview builds were substantially slower, but editing was not nearly as bad as I expected.

The one thing I've found over the years working with Lightroom is that it is a relative of Murphy -- whenever you think you understand it and know what's going to happen -- BAM -- it hits you upside the head with a surprise.
 
If it wasn't for the fact that I am using Lightroom for DAM only (and don't have access to the Develop module), I would have run the benchmark to see how my upgraded PC would go.

I suppose I could create a dummy Adobe account to get a new 7 day trial.
 
But you know, I just went from a D800 as my largest file to a Sony A7RIV, which is 61 MP and about 120 MB per file, and while it is not exactly zippy, I was quite pleasantly surprised to find it did not respond all that differently. Preview builds were substantially slower, but editing was not nearly as bad as I expected.

That is interesting... I notice(d) a very definite difference in the way Lightroom and my laptop coped with the D800E files compared to the increased file size of the D850. I wonder if the .NEF format has something to do with it?

It may simply be because my laptop is now at the low end of what is feasible for Lightroom use, while my camera is being used to stress test computers...
 
The benchmark test on my laptop finally completed...
15" SurfaceBook 2 (2017)
i7-8650 CPU, 16 GB RAM, 1TB SSD

[attached screenshot: benchmark results]


It doesn't really tell me what the problem is... other than being SLOW....

Christelle
 
The more I look at this, the less value it holds for me. This may be the result of being in the software industry during the database benchmark wars, the outcome of which was positive: an industry standard based on scenarios as close to meaningful as possible.

Any benchmark will be dependent on the version of the software being tested, the hardware being used, and what else the OS is having to deal with (e.g. antivirus). Adobe is increasing the frequency of releases, so what would that mean for something like Convert to DNG if LR increased its level of parallel processing based on an improved algorithm? It means the benchmark would need new releases tied to changes in LR.

Under the heading of 'your mileage may vary', a lot of these are one-off steps. I am not frequently switching from LIBRARY to DEVELOP, so who cares if I gain a few seconds in a different configuration? What I would like to see are the times to incrementally change the HIGHLIGHT control while viewing both the image and histogram with CLIPPING ENABLED. Oh, and I use a SECONDARY DISPLAY.

Unlike other benchmarks I've been around, there is no justification for these specific tests. Who did they consult? Who has validated them?

The danger? I expect the results from running the tests on your configuration will have a lower rating than Puget's, leading you to believe your system is under-performing for LR. This may be completely false.
 
The more I look at this, the less value it holds for me. This may be the result of being in the software industry during the database benchmark wars, the outcome of which was positive: an industry standard based on scenarios as close to meaningful as possible.

Any benchmark will be dependent on the version of the software being tested, the hardware being used, and what else the OS is having to deal with (e.g. antivirus). Adobe is increasing the frequency of releases, so what would that mean for something like Convert to DNG if LR increased its level of parallel processing based on an improved algorithm? It means the benchmark would need new releases tied to changes in LR.

Under the heading of 'your mileage may vary', a lot of these are one-off steps. I am not frequently switching from LIBRARY to DEVELOP, so who cares if I gain a few seconds in a different configuration? What I would like to see are the times to incrementally change the HIGHLIGHT control while viewing both the image and histogram with CLIPPING ENABLED. Oh, and I use a SECONDARY DISPLAY.

Unlike other benchmarks I've been around, there is no justification for these specific tests. Who did they consult? Who has validated them?

The danger? I expect the results from running the tests on your configuration will have a lower rating than Puget's, leading you to believe your system is under-performing for LR. This may be completely false.
You have a point. Many years ago I remember computer magazines using tools like PCMark and 3DMark to do all their testing.

I believe there was a module in PCMark (back in the day) that tested the performance of Adobe products.

I see many 3DMark tests being done, but not PCMark.
 
A bad benchmark is worse than no benchmark, IMHO.
 
A bad benchmark is worse than no benchmark, IMHO.
While I understand that sentiment in some ways, almost any consistent benchmark can be used as an investigative tool. That mine was 54 and Christelle's was 29 does not necessarily mean mine is nearly twice as fast, but if she (for example) added memory and three specific steps got much faster, we perhaps learn those are more sensitive to memory. Conversely, if nothing changes, we may learn something as well.

What Puget has done is provide some data and a controlled way to run tests we could not run before -- it was always easy to time (say) building 500 smart previews, but not scrolling through the Develop module, something I have always found incredibly annoying (e.g. when you want to crop and straighten a bunch of shots).

Of course, collecting data will be pointless if the test itself is still a moving target, e.g. if V0.9 gives significantly different results than V0.8, or if any version gives significantly different results when run again and again.

But I think we are wrong to discount this because it is not a diagnostic tool that yields a specific recommendation.

After all, we quite literally have almost nothing at present.
 
So is anyone interested in sharing the beta results?
I am. But I am traveling right now, and the results are on my desktop.
As I recall, my scores (4 runs) were about 730, with a std of 3%.
This is on a desktop with specs similar to what Puget Systems recommended about 2 years ago.
I'll post my results on this Sunday after I return.

And I agree, the biggest gains for us will be to see a change in a measured system when one component is upgraded.
I think I also learned something when I watched my Windows Resource monitor as it plowed through the tests.

Jim
 
I’m a few years past the stage where I would upgrade my computer because a benchmark test told me it was slower than the 98th percentile of fastest gaming machines on the market...

Seeing that an i9-9900 with 64 GB is used as the 100-score benchmark, I didn't expect particularly impressive numbers (especially after @Ferguson scored in the mid 50s), and given the pathetic performance I've been experiencing...

Comparing our numbers shows that something is not quite right, though. Based on a direct comparison, my laptop is better than his at doing auto WB and tone adjustment on the D850 files... Either practice makes perfect(!) or there is something else at play — given that I have about 9000 D850 raw files in my main catalogue (which should not impact the test). My laptop has never seen a Sony raw file before this test.

Christelle
 
So is anyone interested in sharing the beta results? I ran it and got the below. Note the last step appears to hang (removing DNGs, if I recall), but I just left it for a while and it eventually finished. It also takes quite a while to run -- start it and go to dinner or something.

View attachment 13675
Sure. Here's my result, which I find surprising in comparison to yours. The overall score is almost identical (although we get there in rather different ways), but our specs certainly aren't... In theory I could substantially upgrade my CPU and GPU and quadruple my RAM (at double the speed), and that would make no difference? I find that difficult to believe, which suggests that something else is going on here.

[attached screenshot: LrBenchResults_12-18-14-07.jpg]
 
@Jim Wilde,
I find your results quite telling...
Your machine has the same amount of RAM as mine, and a lower-spec CPU, but it achieves almost double the overall performance of my laptop — except for Auto settings on D850 raw files, where I’m still the champion :)

That made me have a look and realise my CPU has been running at 1.9 GHz (or that is the reported speed), instead of 4.0 GHz... I wonder if this has changed, and explains the sudden loss of performance? Overheating would cause the CPU to throttle down... Is there a way to easily track the actual CPU speed during the test (short of running the benchmark inside and outside the fridge)?
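One lightweight way to track the reported clock speed during a run is to sample it from a small script. Here is a minimal sketch in Python, assuming the third-party psutil package is installed (pip install psutil); on some Windows machines psutil reports a fairly coarse frequency value rather than the live turbo clock, and the log file name is an arbitrary choice:

```python
# Minimal sketch: log the reported CPU frequency and utilisation once a second
# while the benchmark runs. Assumes the third-party 'psutil' package is
# installed; the CSV file name is arbitrary.
import csv
import time

import psutil

with open("cpu_freq_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "freq_mhz", "cpu_percent"])
    start = time.time()
    try:
        while True:
            load = psutil.cpu_percent(interval=1)   # averaged over 1 second
            freq = psutil.cpu_freq()                # current/min/max in MHz, may be None
            writer.writerow([round(time.time() - start, 1),
                             round(freq.current) if freq else "",
                             load])
            f.flush()
    except KeyboardInterrupt:
        pass  # stop logging with Ctrl+C once the benchmark finishes
```

Start it just before launching the benchmark, stop it with Ctrl+C afterwards, and the CSV will show whether the clock sat near 1.9 GHz for the whole run or only dipped during certain steps.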
 
@Jim Wilde,
I find your results quite telling...
Your machine has the same amount of RAM as mine, and a lower-spec CPU, but it achieves almost double the overall performance of my laptop — except for Auto settings on D850 raw files, where I’m still the champion :)

That made me have a look and realise my CPU has been running at 1.9 GHz (or that is the reported speed), instead of 4.0 GHz... I wonder if this has changed, and explains the sudden loss of performance? Overheating would cause the CPU to throttle down... Is there a way to easily track the actual CPU speed during the test (short of running the benchmark inside and outside the fridge)?
I don't know if this will work, but how about having Task Manager open and Lightroom not in full screen?
 
I don't know if this will work, but how about having Task Manager open and Lightroom not in full screen?
@Danielx64,
I’ve been running Lightroom that way — which is why I know the CPU utilisation runs between 50-70%, and that it only accesses 4-6GB of RAM (and tells me 80% of RAM is used). I’ve been focusing on trying to find other applications/processes/services running, and haven’t thought about watching the performance monitor screen while running the benchmark.

As long as Lightroom remains in focus, the script should keep running, it may just be a bit slower...

I will give that a go — unfortunately running the benchmark takes a bit longer than the quoted 30 minutes on my machine :)

Christelle
 
Adobe made some changes a while back that allowed more parallelism, but they only kicked in somewhere above 4 cores; I'm not sure if 6 or 8 (8 definitely, but I couldn't test 6).

This is relevant to those with Intel CPUs that have Hyper-Threading available. Hyper-Threading makes some Intel CPUs appear to have twice the core count. If I turned it off (giving me apparently 4 cores instead of 8), a lot of things changed, notably import and preview building, which became much slower.

Where people say "something changed", it is worth checking whether that switch got changed in your BIOS/UEFI.

Note I didn't look up any of the CPUs mentioned to see if they have Hyper-Threading.

Note: Hyper-Threading, not Hyper-V; different things.
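As a quick sanity check on that BIOS/UEFI switch, the operating system's view of it can be read in a couple of lines of Python. A minimal sketch, assuming the third-party psutil package is installed (not something LR or the benchmark requires):

```python
# Minimal sketch: compare physical cores with logical processors.
# If logical == 2 * physical, SMT/Hyper-Threading is almost certainly enabled.
import psutil

physical = psutil.cpu_count(logical=False)
logical = psutil.cpu_count(logical=True)

print(f"physical cores:     {physical}")
print(f"logical processors: {logical}")
print("SMT/Hyper-Threading appears to be",
      "enabled" if physical and logical and logical > physical else "disabled")
```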
 
Adobe made some changes a while back that allowed more parallelism,
You may already know this, Ferguson, but sometimes parallelism is implemented in the software and not the hardware. One of the challenges for software that runs on different CPUs/GPUs is how much custom code needs to be written per platform. One company I worked for started, and continued to develop, software on the Windows platform. When they needed to create an offering for Unix, they used a Unix Windows-emulation library for C++ that translated Windows calls to Unix equivalents. Optimum? Not likely, but fortunately good enough. I have no idea how Adobe decides on common versus per-platform software development.

Benchmarks are not generally used for diagnostics but for decision-making about a new acquisition. If you are interested in performance diagnostics, follow some of the links for LR performance already listed in this thread, or look at the Windows Performance Analyzer (WPA). Full disclosure, I have not used WPA.
 
@Paul_DS256, the change I am talking about was specifically that Adobe implemented different algorithms for different core counts. I do not mean just that it worked better. For example (and I have not looked at this since 7.mumble, so not sure if it changed), if you import and build previews with 4 cores, it would import, finish importing, then start building previews. With 8 cores it would kick off the preview build before it finished importing; indeed, it would kick it off several times if the preview build finished the set it had at the time before the import finished.

I.e. LR actually did different things, not just the same things at different speeds.

The issue with Windows performance diagnostics is that it is very hard to reliably reproduce certain behavior in LR. Brush strokes, or scrolling between shots in Develop, are good examples. They occur briefly when done by hand, and it is hard to see which resources in Windows were constrained while they were done. It is also hard to change something and reproduce the same scenario in LR to see if it got better, because of the human mouse movements and typing involved.

What I find interesting about this tool is less its measurements than the fact that they have scripted all of that, so you can run things over and over, changing some setting or resource, and see the impact. What would be REALLY nice is if I could run just one sub-test over and over. (Hint, hint, if Puget is reading.)

Trust me, I've been bitten by more benchmarking and performance measurement issues than I can even remember; computers were my day job, photography just a hobby. And this is not an ideal tool. But it is a tool I didn't have before.
 
The issue with Windows performance diagnostics is that it is very hard to reliably reproduce certain behavior in LR. Brush strokes, or scrolling between shots in Develop, are good examples
Completely agree, Ferguson. WPA is not instrumented into LR, so you can only get results for LR as a whole.

However, what you can do is plan your test. For example:
  1. Start LR and warm it up, e.g. move between the modules you will be testing. Leave it in the module you will be testing, e.g. DEVELOP.
  2. Start WPA tracing.
  3. Note the time and allow a one-minute quiet period. You should be able to see this quiet period in WPA.
  4. Perform a specific test in LR. For example:
    1. Increase HIGHLIGHT 5 times.
    2. Decrease HIGHLIGHT 5 times.
    3. Stop.
  5. Note the time and allow a one-minute quiet period.
  6. Perform the WPA analysis.
Not ideal, but it comes down to what you are trying to accomplish and whether the Puget benchmark gives you anything beyond what you can already get. My concern is red herrings.
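A rough scripted version of that plan might look like the sketch below. It assumes wpr.exe (Windows Performance Recorder, bundled with recent Windows 10 builds) is on the PATH; the GeneralProfile recording profile and the output file name are arbitrary choices, and the LR steps themselves are still performed by hand:

```python
# Rough sketch of the WPA test plan above: start a WPR trace, bracket the
# manual LR steps with one-minute quiet periods, then stop the trace for
# analysis in WPA. Assumes wpr.exe is on the PATH.
import subprocess
import time
from datetime import datetime

def mark(msg):
    # Print a wall-clock timestamp so events can be lined up with the trace.
    print(f"{datetime.now():%H:%M:%S}  {msg}")

input("Warm up Lightroom, leave it in the Develop module, then press Enter...")

mark("starting WPR trace")
subprocess.run(["wpr", "-start", "GeneralProfile"], check=True)

mark("quiet period (1 minute)")
time.sleep(60)

mark("perform the test now (e.g. Highlight +5 clicks, then -5 clicks)")
input("Press Enter when the test steps are done...")

mark("quiet period (1 minute)")
time.sleep(60)

mark("stopping WPR trace")
subprocess.run(["wpr", "-stop", "lr_highlight_test.etl"], check=True)
mark("trace saved to lr_highlight_test.etl; open it in WPA")
```

The printed timestamps mark the quiet-period boundaries to look for when the .etl file is opened in WPA.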
 
Good morning everyone!

My second set of results (generated while Resource Manager was active) looks quite a bit different (and better).
I had to restart the benchmark after it died before exporting the D850 files (it completed the Canon and Sony files, but got stuck and stopped running).

The following conditions were different during the two tests:
-- Task manager / resource manager window was open during the second test, with the CPU performance graph active
-- The laptop was not connected to the internet during the second test. I didn't disable WiFi, I just didn't connect to the phone's hotspot (we're travelling), whereas the first test was done with the laptop connected to the internet (NBN) through the home WiFi router. In both cases Windows virus protection / Windows Defender was disabled.
-- It has been about 10 degrees C cooler in the room while running the second test.

Some observations during the benchmark (I went to sleep after restarting it):
-- During most of the time, performance seemed to be processor-limited.
-- The maximum memory used was 9.3 GB, and it stayed relatively constant.
-- While building Smart Previews, the hard disk was the apparent limitation: the system used about 70% CPU at 1.5-1.7 GHz while hard disk use was up around 30% of capacity.
-- During the CPU-intensive tasks, some strange, saw-tooth behaviour occurred, as if something was limiting or throttling the CPU, without an apparent uptick in disk activity.

[attached screenshot: benchmark results]


I am not sure (yet) how useful the tool will be in comparing different systems (or selecting hardware for purchase), but it is an extremely useful way for me to run a set of standardised actions and measure the outcomes to try and diagnose whether I'm having hardware or software issues, and whether I can make a significant impact by changing some settings.

I need to investigate the cores / logical processor issue further. I also need to find out if Microsoft did something else to throttle the CPU during the last updates (like they did earlier in the year).

And I seem to get almost 50% more performance out of my laptop if I work during the night...

Christelle
 
During the CPU-intensive tasks, some strange, saw-tooth behaviour occurred, as if something was limiting or throttling the CPU, without an apparent uptick in disk activity.

The two “screenshots” show the CPU graph I mentioned. The top (first) one shows it at 37% at 3.27 GHz, while the bottom one shows it at 66% at 2.03 GHz.
[attached screenshot: CPU graph at 37%, 3.27 GHz]


[attached screenshot: CPU graph at 66%, 2.03 GHz]


During the Auto White Balance and Tone, the CPU utilisation went down to 15% — with nothing else taking up the slack...
[attached screenshot: CPU graph at 15% during Auto White Balance and Tone]
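To check whether that saw-tooth pattern lines up with disk activity rather than a clock-speed drop, CPU load, reported frequency and disk throughput can be sampled side by side. A minimal sketch, again assuming the third-party psutil package; the 30-minute duration and one-second interval are arbitrary choices:

```python
# Minimal sketch: sample CPU load, reported clock speed and disk throughput
# together, to see whether a CPU saw-tooth lines up with disk activity.
import psutil

prev = psutil.disk_io_counters()
for _ in range(30 * 60):                       # one sample per second for ~30 minutes
    load = psutil.cpu_percent(interval=1)      # averaged over the 1 s interval
    freq = psutil.cpu_freq()                   # may be coarse or None on some PCs
    cur = psutil.disk_io_counters()
    read_mb = (cur.read_bytes - prev.read_bytes) / 1e6
    write_mb = (cur.write_bytes - prev.write_bytes) / 1e6
    prev = cur
    mhz = f"{freq.current:6.0f} MHz" if freq else "   n/a"
    print(f"cpu {load:5.1f}%  {mhz}  disk R {read_mb:6.1f} MB/s  W {write_mb:6.1f} MB/s")
```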
 
This is relevant to those with Intel that have Hyperthreading available. Hyperthreading is a tool to make some Intel's appear to have twice the core count. If I turned it off (giving me apparently 4 instead of 8) a lot of things changed, notably during import and preview build, making them much slower.
AMD also has its own version of Hyper-Threading (SMT).

Also, some newer Intel CPUs don't have Hyper-Threading, so you lose quite a bit of performance compared to older models.
 
Also, some newer Intel CPUs don't have Hyper-Threading, so you lose quite a bit of performance compared to older models.
Well, maybe. TANSTAAFL. If you have X real cores but your workload generally uses <= X threads, turning HT off is usually better, as you avoid the scheduling overhead of Hyper-Threading.

But with Adobe gating certain performance features on having more than a certain number of cores, turning it on is probably a good thing for LR.

Incidentally, some of the esoteric side-channel and speculative-execution bugs (Spectre, ZombieLoad, etc.) had fixes that manufacturers put out which likely slowed down Lightroom for some users. Hyper-Threading was for a while on the chopping block because of ZombieLoad, but I doubt any home users noticed (though it may or may not have changed some OEMs' shipping defaults in the BIOS/UEFI). So some people who think their system got slower over the last year -- well, you may be right. More secure, maybe (frankly these were very low-risk items for home users), but slower. Welcome to the world where hackers are winning the war.
 
While I understand that sentiment in some ways, almost any consistent benchmark can be used as an investigative tool.

[snipped]

After all, we quite literally have almost nothing at present.
Agreed. I think it's only a matter of time (and the end of beta status) until Puget publishes benchmark results for the systems they sell, and modifies their custom order page to give you an approximation of the benchmark score for the configuration you have selected. If they don't, I for one will be surprised.
 
I am. But I am traveling right now, and the results are on my desktop.
As I recall, my scores (4 runs) were about 730, with a std of 3%.
This is on a desktop with specs similar to what Puget Systems recommended about 2 years ago.
I'll post my results on this Sunday after I return.


Jim
For what it is worth, I am posting my results from a desktop.
I had misremembered the numbers I wrote earlier.
The average of 4 runs, two with the resource monitor running, and two without was 719.5 with a std of 1.4%

The first run was
[attached screenshot: LrBenchResults_12-17-23-00.jpg]

and was the highest of the four; no resource monitor.
The run closest to the average is
[attached screenshot: LrBenchResults_12-18-11-03.jpg]


with the resource monitor active, though I can't see any meaningful differences.
I shoot Canon, a 90D with 32 MP & CR3 files, though I shot CR2s previously.
It isn't clear to me why the creation of HDRs and panos has such a low average score.
Is this my GPU? Possibly.

Jim
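For anyone summarising their own repeated runs the same way, the mean and relative standard deviation are easy to compute; a tiny sketch follows (the four scores below are placeholders, not Jim's actual numbers):

```python
# Minimal sketch: mean and relative standard deviation of repeated benchmark
# runs. The four scores below are placeholders for your own results.
from statistics import mean, stdev

scores = [712.0, 718.0, 723.0, 725.0]  # hypothetical overall scores
avg = mean(scores)
rel_std = stdev(scores) / avg * 100    # sample std dev as a % of the mean

print(f"average = {avg:.1f}, std = {rel_std:.1f}%")
```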
 