NVMe SSD Speed with LR in the Develop Module


GregJ

There have been several recent threads where we talked about the behavior in LR where, when we first load a raw by clicking on it in the Develop Module, we see a slight lag that is the same no matter how fast the drive the raw is stored on. There is a huge speed and performance difference between an old spinning hard drive in an external enclosure and an M.2 NVMe PCIe 4 SSD. Yet we don't feel that difference when LR first accesses that raw in the Develop Module. (Note: the Develop Module does not use the preview file the way the Library module does, which is why the Library module feels quicker.)

A lot of us were speculating that Adobe was probably working on this, and that it had to do with the program somehow not making the best use of the latest-gen SSDs and all of the remarkable capability they provide. That is common among productivity programs these days and is a problem for the big high-res games too.

I saw an article in Tom's Hardware this morning that I think could be related. It is technical, but it covers new firmware coming soon to Gen 4 SSDs that brings DirectStorage-class performance to the masses, allowing games and productivity programs to read large files (like a raw) from PCIe Gen 4 NVMe SSDs at vastly faster speeds. Currently, SSD controllers are optimized for rapid-fire small accesses, not the bigger gulps LR has to take when first opening a big 50-100 MB raw file. This firmware could change that and do both really fast without sacrificing one for the other.
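If you want to see the small-access vs. big-gulp difference for yourself, here is a quick toy benchmark (a minimal sketch; the file name and the 75 MB size are just my assumptions, and the OS page cache will skew repeat runs):

```python
import os
import time

PATH = "sample.raw"          # assumed: any raw-sized file on the drive under test
SIZE = 75 * 1024 * 1024      # assumed ~75 MB, like a big raw

# Create a throwaway test file if one doesn't exist.
if not os.path.exists(PATH):
    with open(PATH, "wb") as f:
        f.write(os.urandom(SIZE))

def timed_read(chunk_size):
    """Read the whole file in chunks of the given size and return seconds."""
    start = time.perf_counter()
    with open(PATH, "rb", buffering=0) as f:   # unbuffered, so chunk size matters
        while f.read(chunk_size):
            pass
    return time.perf_counter() - start

# Rapid-fire small accesses vs. one big-gulp-sized chunk at a time.
for chunk in (4 * 1024, 8 * 1024 * 1024):
    print(f"{chunk >> 10:>6} KiB chunks: {timed_read(chunk):.3f} s")
```

The gap you see (or don't) depends heavily on the controller and the OS, which is exactly the kind of thing this firmware is aimed at.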

I could be wrong, but I bet something like this is going on with LR. Worth a skim. If SSDs get this firmware (and they will) and Adobe optimizes LR for it, raw files could load in the Develop module for the first time at warp speed (near instantly). This is interesting even if Adobe does nothing with it, because it points out some of the problems productivity programs like Adobe's have in optimizing for this deluge of ever-faster drives. This ain't like the old days, when everything ran off spinning hard drives that went years without changing speeds, letting software vendors rest on their haunches.

https://www.tomshardware.com/features/the-directstorage-advantage-phison-io-ssd-firmware-preview
 
@GregJ

Why should Adobe, which clearly sells application programs that run under both Windows and macOS, be that concerned about driver-level issues? Of course, if you have an SSD, or better yet a PCIe 4 NVMe drive, in your system, Lightroom and all other programs will run faster when accessing files stored on it.

But absent specific, detailed information from Adobe (and those who know can't say), I think the correct assumption is that Adobe is not concerned with drive-level issues; GPUs are the exception, of course. Considering the large number of first-party and third-party storage options available on both Windows and macOS, a concerted effort by Adobe would suck up a lot of development resources and distract Adobe from adding new features and fixing bugs.

Do you know anything specific about Adobe and non-GPU driver issues? Or, as you say in the post I quoted, were you just "speculating"? By the way, who is "a lot of us"? I honestly don't think this forum is the right one for such speculation, which might cause some confusion. Tom's Hardware, [H]ardForum, TechRepublic, AnandTech, Reddit subforums, and the boatload of overclocker forums out there are better for this question.
 
We are going to have to agree to disagree on that, Phil. We will see over the next year, as there are big moves in the hardware arena this year and next by Intel, AMD, Nvidia, Apple, and all of the RAM and flash makers. It is very interesting, and you can bet Adobe is all over it. Adobe is working hard at getting better and faster, and at utilizing as best they can the latest computer and camera gear that photographers use in their work. They are often hampered by old code and structures that have to be rewritten and fitted in (as with all software that has been around for a while), and as Victoria said, it was pretty hard work rewriting LR to use the GPU on exports. I'm sure they will continue to make LR better for us. And you can bet they want to make better use of more cores, the latest GPUs, and the read/write performance of the latest drives.
I think this thread topic was relevant to discussions that have occurred on at least six threads in the past three weeks.
That said, if the staff here thinks that posting something like this is beyond the scope of the forum, or just a waste of space, then it should be deleted.
I thought it was interesting and relevant, but please delete the thread if it is deemed not of interest, since I posted this without asking a question.
 
It's in the Lounge, it can run. That said, Phil's right: Adobe generally focuses on optimizing for a typical spec rather than the fastest money can buy.
 
Greg,

My thinking about these issues starts with the business of producing and selling software.

Almost all of my career was in product management and product marketing, mostly software but also some hardware, all with Silicon Valley companies. Thanks to the boom-or-bust situation of most of these companies, I had the "opportunity" to work for a variety of companies: startups, small, and large. Some of my employers were listed on NASDAQ or NYSE, and some were privately held. I worked on both "infrastructure" products (TCP/IP layers 1-4) and "application" products (TCP/IP layers 5-7). Besides TCP/IP, I also worked on communications products for OSI, PC LANs (Novell NetWare), IBM's SNA, and DECnet, as well as information security/cryptography.

And if nothing else, it was true at all of these companies that major decisions about adding new features to the product required justification, usually with a business case. A key part of the business case was being able to demonstrate revenue growth from adding new features, creating or increasing "competitive differentiation," being "best of breed," and building on the brand and "brand promise." Most companies, except for some startups, have learned the hard way that adding new features because "it's fun to work on them" or because they are "cool" is bad business.

For TCP/IP layers 1-4, Adobe certainly uses a privately branded established cloud service provider, perhaps Amazon or Microsoft. Adobe's "brand promise" for Lightroom cloudy is all about photo organization, storage, and editing. We don't hear a word from Adobe about how their cloud service is "better" than a competitor's.

For AMD and Intel, it is important to support new storage hardware with chipsets and drivers, because such work is integral to competitiveness and builds up the brand promise. Similar considerations apply to AMD's and NVidia's GPU businesses, to PC bus speeds and memory types, and to Microsoft, Apple, and the commercial Linux vendors for operating systems. However you think about the market, none of these companies competes with Adobe, and no one thinks of any of them as an Adobe competitor.

For Adobe, it is important for the brand to leverage GPU cards to increase performance, and there is a good business case for that, made easier because there are a limited number of GPU manufacturers. From what I have seen, Adobe seems to focus on NVidia. That effort supports the Adobe brand with increased speed for key Develop functionality.

However, we have never seen Adobe try to improve Lightroom or Photoshop by providing improved performance for specific models of printers, monitors, mice, keyboards, or tablets (e.g., Wacom). Such an effort could involve hundreds or even thousands of products; it would be scattershot at best and would not really build up the Adobe brand. Nor does Adobe sell ink or paper for photo printers, or picture frames.

Look at this web page, particularly the Product Engineering section: https://www.adobe.com/careers/teams/engineering-and-product.html. Do you see anything about Product Engineering working on low-level storage devices?

If you are so inclined, download the source code for GIMP from https://download.gimp.org/mirror/pub/gimp/v2.99/ and see if you can find any code for NVMe or SSD drives.

The idea that one company has to build all of its own hardware and software for a new product ended in 1981 with the introduction of the original IBM PC. Today, even Apple, which is almost unique in building its own CPUs, still uses many third-party components for its Mac systems and iPhone/iPad products.
 
Phil, that is very interesting. You spent a lifetime in the business, so you forgot more yesterday than I'll ever know about software structure, coding, and what the big guys like Adobe tool their products for when it comes to emerging hardware.
This whole discussion came up because several Gurus and enthusiasts on the forum noted that the Develop module has that little noticeable lag when it loads a raw up front and then builds that 1:1 preview. Even on the fastest rigs (like mine), one notices the two-second little gulp for air. But the most amazing thing is that the Gurus and stopwatch holders apparently notice no difference in that raw-loading behavior between files stored on an old external spinning HDD and files on the fastest external or internal SSD. That is pretty amazing, and it is what started this whole line of discussion.
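One way to check where that two-second gulp actually goes is to time the disk read separately from the decode/render step. A rough sketch (rawpy here is just a stand-in decoder, not what LR uses, and the file name is hypothetical):

```python
import time
import rawpy  # assumption: pip install rawpy; a stand-in decoder, not LR's

PATH = "IMG_0001.CR3"           # hypothetical raw file on the drive under test

t0 = time.perf_counter()
with open(PATH, "rb") as f:
    f.read()                    # pure disk I/O: the only part a faster drive helps
t1 = time.perf_counter()

with rawpy.imread(PATH) as raw:  # re-reads the (now cached) file, then...
    rgb = raw.postprocess()      # ...demosaics and renders: CPU-bound work
t2 = time.perf_counter()

print(f"read:   {t1 - t0:.3f} s")
print(f"render: {t2 - t1:.3f} s")
```

If the render line dwarfs the read line, that would explain why the stopwatch holders see no difference between drives.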
As you know, the humble (and cheap) SATA SSD can read and write data four times faster than the fastest HDD. 4x is big, but it feels like 15x, because the HDD's higher latency contributes to the night-and-day feel between the two. The SATA SSD feels a zillion times faster than an HDD, not just 4x faster. That latency adds to the feeling of sluggishness on the HDD.
That latency gap is not in play when comparing different classes of SSDs. SATA SSDs have a max throughput of 600 MB/s. PCIe 3 is about 3,500 MB/s, PCIe 4 about 7,500 MB/s, and PCIe 5 about 13,000 MB/s. So going from an HDD to a simple SATA SSD gives you a huge leap in feel for any task. Going from the SATA SSD to the current sweet spot of PCIe 3 is another jump of roughly 6x, yet you don't notice it as much. Then PCIe 4 (which is getting common) doubles the speed again, and PCIe 5 SSDs will double it once more. By the time you get to PCIe 5 you are more than 20 times faster than a SATA SSD, and far more than that compared with an HDD (depending on the task and how you count it, but it is a huge difference).
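As a back-of-envelope check (the ~150 MB/s HDD figure is my assumption; the SSD numbers are the peaks quoted above), here is what those speeds mean for one 75 MB raw:

```python
# Rough transfer time for a single 75 MB raw at each interface's peak
# sequential read speed. The HDD figure is an assumption.
speeds_mb_s = {
    "Fast HDD":    150,
    "SATA SSD":    600,
    "PCIe 3 NVMe": 3500,
    "PCIe 4 NVMe": 7500,
    "PCIe 5 NVMe": 13000,
}
RAW_MB = 75
for drive, mb_s in speeds_mb_s.items():
    print(f"{drive:<12} {RAW_MB / mb_s * 1000:7.1f} ms")
```

Even the HDD's ~500 ms is small next to a two-second load, which fits the observation that the Develop lag barely changes across drives.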
Yes indeed - it's a big deal. And what I'm saying is that I bet you that Adobe is on this big-time. If you say they are not, I disagree.
They don't want their Develop module feeling like it is running off an HDD. It is a big deal to them. They know this, and they are working on it.
I don't know much, but I know that.
 
Looks to me like this debate is based on theoretical speed rather than real-world usage. Feel free to test all these various permutations so there's some real proof of the benefits.

The speed of the storage is not the only factor in all of this. They're building for the masses, not to eke out the best performance from the cutting-edge hardware only a few customers own, so they focus on things like caching to reduce load time, especially when working through images consecutively, which benefits even slower hardware.
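For illustration only, here's a toy sketch of that kind of read-ahead caching (my own code, not Adobe's): preload the next file's bytes in the background while the current one is on screen.

```python
import threading

class PrefetchCache:
    """Toy read-ahead cache: warm the next image while editing the current one."""

    def __init__(self):
        self._cache = {}
        self._lock = threading.Lock()

    def prefetch(self, path):
        """Start loading a file in a background thread."""
        threading.Thread(target=self._load, args=(path,), daemon=True).start()

    def _load(self, path):
        with open(path, "rb") as f:
            data = f.read()
        with self._lock:
            self._cache[path] = data

    def get(self, path):
        """Return cached bytes if the prefetch finished, else read synchronously."""
        with self._lock:
            data = self._cache.pop(path, None)
        if data is None:
            with open(path, "rb") as f:
                data = f.read()
        return data

# Usage: while photo N is on screen, warm the cache for photo N+1.
# cache = PrefetchCache()
# cache.prefetch("IMG_0002.CR3")    # hypothetical next file
# data = cache.get("IMG_0002.CR3")  # near-instant if the prefetch finished
```

With a scheme like this, even a slow drive mostly hides its latency as long as you move through images roughly in order.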
 
Victoria, they are not building for the masses, in my opinion. It depends on what you mean by the masses. The masses shoot on phones. Photographers who shoot raw are not the masses. Photographers who shoot raw also have computers that boot off SSDs, use GPUs, and store their images on SSDs.
So if, as the Gurus noted on several threads, the Develop module behaves the same loading raw files from an external spinning HDD as it does from disks that are twenty times faster, then that is something they can improve on.
I'm sure they are working on that. Why would they not? You said Adobe's priority was to make LR faster, more responsive and more efficient.
It is plenty fast for me now, and I have a nice desktop rig and laptop. But if the Develop module loaded a file faster the first time, I would not complain, and if it used the GPU for more functions (like building previews), that would be good. But it is not a big deal.
 
Victoria, they are not building for the masses, in my opinion. It depends on what you mean by the masses.

By the masses, I mean the majority of Lightroom users, as opposed to those with the unlimited cash and expertise to build the ultimate rig.
 