
Internal Catalog Changes (was: Edit history and catalog size)

Status
Not open for further replies.

Hoggy
Never take life, or anything in it, too seriously.
Joined: Nov 20, 2012 · Messages: 527 · Location: Wisconsin
Lightroom Experience: Advanced · Lightroom Version: Classic 7.01

It's likely that the new catalog version hasn't been used extensively by many people yet, so this may still be an unknown. Of course, I also realize the limited usefulness of keeping edit histories - but this question leans a little more towards the hypothetical/academic. (Although it could mean that I might be more likely to keep a few more 'useful' edit histories on some images than in pre-7.0 versions.)

As many may have noticed, LR catalogs shrink quite a bit when upgraded from pre-7.0 to 7.0. Mine was reduced by more than half, from ~1.2GB to ~512MB unzipped.

I also have been recompressing the zip backups to 7zip-Zstandard codec, level 17. Pre-7.0, they would recompress from a ~280MB (IIRC?) zip file to an ~86MB 7zip file. Post-7.0, they recompress from a ~260MB zip file to a ~154MB 7zip file. So it seems as if 7.0 might be writing compressed data directly into the catalog file, leading to poorer recompression to 7zip-Zstandard-level17. Of course, that's just a guess as to what might be happening in order to explain those findings (therefore I'm open to other possible reasons to explain that, btw).

Sooooo.. Has anyone done any testing to see if keeping long edit histories bloats the catalog as much as pre-7.0 edit histories did? Or otherwise have any 'direct' knowledge if they should bloat it less now, post-7.0?
 
As part of the work optimising performance, Adobe now compress the History log data and I would not expect the LR7 catalogue to grow as quickly as the LR6 equivalent. History compression during catalogue upgrading is the main reason why LR7 catalogues are smaller than LR6, and half does not surprise me (it is consistent with when I cleared a big LR6 catalogue's History). Your observations about zipping backups are also what I would expect, since the History is already compressed.

John
 
History compression during catalogue upgrading is the main reason why LR7 catalogues are smaller than LR6, and half does not surprise me (it is consistent with when I cleared a big LR6 catalogue's History).

Thanks for confirming my observations/suspicions. But as for the quoted statement, there must be something else quite big that it's compressing. As a rule, I always clear history data after making a snapshot, except maybe for current in-progress editing - so there would be practically no history there unless I forgot on a couple/few images.

However, each non-tiff/psd image has at least one 'v1.00' snapshot, and there may be an average of 2 or 3 on most images (~70%), with some rare ones that have about 10 snapshots (maybe ~6 images in total there). I always make it a point to delete redundant/now-pointless snapshots as I make my way through my entire 'artsy' catalog a 4th/5th or 60th time :cool2: in order to keep the size low. But even solely considering snapshot compression, I'm quite shocked that it could have accounted for that much of a decrease. I think there was less of a decrease in size when I first started deleting all my [long] edit histories - but it's been quite a while since I finished that, so I could be remembering it wrong.

(ETA: oh.. This 'artsy' catalog is only ~3000 images - so probably quite small compared to others'.)
 
My guess is that the difference is because upgrading also includes the standard optimisation which executes a SQL vacuum statement, cleaning up any temporary or orphaned data.
 
My guess is that the difference is because upgrading also includes the standard optimisation which executes a SQL vacuum statement, cleaning up any temporary or orphaned data.
This is good to read. In the past, I have executed my own SQL vacuum statement on recalcitrant catalog files to achieve this same end.
 
Optimise definitely includes vacuum, Cletus. I am not as sure about what else it may do.
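
For anyone curious, the effect of a vacuum is easy to demonstrate, since the catalog is just an SQLite file (back it up before experimenting on a real one). A minimal sketch with Python's stdlib sqlite3, using a throwaway database as a stand-in for a catalog:

```python
import os
import sqlite3
import tempfile

# Build a throwaway SQLite database, delete most of its rows, and show
# that VACUUM reclaims the freed pages into a smaller file.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

con = sqlite3.connect(path)
con.execute("CREATE TABLE history (id INTEGER PRIMARY KEY, step TEXT)")
con.executemany(
    "INSERT INTO history (step) VALUES (?)",
    [("develop step %d" % i,) for i in range(50_000)],
)
con.commit()
before = os.path.getsize(path)

# Deleting rows marks pages as free but does not shrink the file...
con.execute("DELETE FROM history WHERE id % 10 != 0")
con.commit()

# ...until VACUUM rebuilds the database into the smallest possible file.
con.execute("VACUUM")
con.close()
after = os.path.getsize(path)

print(before, after)  # the vacuumed file is noticeably smaller
```

Whether Lightroom's optimise step does anything beyond this statement is exactly the open question above.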
 
I compared old and new DB versions with SQLite browser. Development history and XMP content are compressed now. In LR6 they were stored as plain text strings; now they are compressed binary data.
 
I compared old and new DB versions with SQLite browser. Development history and XMP content are compressed now. In LR6 they were stored as plain text strings; now they are compressed binary data.

That leads the techno-geek in me to wonder now, about which codec they're using. It would seem to me that Zstandard would be ripe for LR to use internally, due to it's ultra-crazy-fast compression & decompression (yet superb ratio). I suppose I might have to finally check out one of the free SQL browsers I've downloaded recently to see if it can be figured out... I've long been back-burner curious about SQL, anyways.
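
The first bytes of a blob usually give the codec away: zlib streams start with 0x78, gzip with 0x1F 0x8B, and Zstandard with 0x28 0xB5 0x2F 0xFD. A hedged sketch in Python; the catalog table and column names in the comment are my assumption based on older catalog schemas, so check them in a SQL browser first:

```python
import sqlite3
import zlib

# Magic numbers for a few common compression formats.
MAGICS = {
    b"\x78": "zlib/deflate",
    b"\x1f\x8b": "gzip",
    b"\x28\xb5\x2f\xfd": "Zstandard",
}

def sniff_codec(blob: bytes) -> str:
    """Guess the codec of a binary blob from its leading magic bytes."""
    for magic, name in MAGICS.items():
        if blob.startswith(magic):
            return name
    return "unknown (or uncompressed)"

# Quick self-check: zlib output is recognisable by its 0x78 header byte.
sample = zlib.compress(b"develop history step" * 100)
print(sniff_codec(sample))  # → zlib/deflate

# Against a real catalog you would do something like this (table and
# column names are an assumption, verify them in your SQL browser):
#   con = sqlite3.connect("My Catalog.lrcat")
#   row = con.execute(
#       "SELECT text FROM Adobe_imageDevelopSettings LIMIT 1").fetchone()
#   print(sniff_codec(row[0]) if isinstance(row[0], bytes) else "plain text")
```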

Although, I must say, I have mixed feelings about writing compressed data directly into the database. On the one hand, it obviously makes the 'uncompressed' catalog size much smaller. But on the other hand, I've been enjoying recompressing them (via unzip then solid-7zip) with various codecs so as to make the best use out of my free space with Google Drive - while at the same time trying hard not to abuse that free space, because this 'po-boy' depends heavily on that free space :). By playing around with the command-line options of 7-zip[-Zstandard], I've been able to deep-freeze older catalogs down to 48MB (of 1.2GB original) by using optimized options with LZMA(1)-level-9ultra codec.
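
For what it's worth, the gain from going zip to solid LZMA is reproducible with just the standard library (Zstandard isn't in Python's stdlib, so this sketches only the LZMA side, and the file contents here are made-up stand-ins for a catalog backup):

```python
import lzma
import os
import tempfile
import zipfile

# Recompress a zip archive's contents as a single solid LZMA stream.
# Solid compression lets the codec find matches across file boundaries,
# which is part of why 7zip archives can beat plain zip by so much.
def recompress_zip_to_xz(zip_path: str, xz_path: str) -> None:
    chunks = []
    with zipfile.ZipFile(zip_path) as zf:
        for name in zf.namelist():
            chunks.append(zf.read(name))
    data = b"".join(chunks)
    with open(xz_path, "wb") as out:
        out.write(lzma.compress(data, preset=9 | lzma.PRESET_EXTREME))

# Demo with a throwaway zip full of repetitive, catalog-like text.
tmp = tempfile.mkdtemp()
zip_path = os.path.join(tmp, "backup.zip")
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("catalog.lrcat", "history step; " * 200_000)

xz_path = os.path.join(tmp, "backup.xz")
recompress_zip_to_xz(zip_path, xz_path)
print(os.path.getsize(zip_path), os.path.getsize(xz_path))
```

The already-compressed blobs inside a 7.0 catalog are exactly the kind of data this trick stops working on, which matches the poorer recompression ratios above.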

Ugh! I suppose I'll have to go through some more rounds of researching and testing various options of various codecs to see if some might do better with the new post-7.0 compressed-uncompressed catalog format. Not looking forward to that, though, as I just got done with that a few months ago - and have already set up several menu-button scripts in Directory Opus for it.

Just when you think you have something figured out.. BOOM - technology goes changing on ya. Will it ever end? :rolleyes:
 
Although, I must say, I have mixed feelings about writing compressed data directly into the database. On the one hand, it obviously makes the 'uncompressed' catalog size much smaller. But on the other hand
Personally I think this was a solution in search of a problem. It was handy (for those who like to query in SQL) to have the history there, often I have looked for a specific event (e.g. preset applied, etc.) in the history, now that's not happening.

My uncompressed LR6 catalog was 3.1GB, my LR7 catalog is about 1.7GB. Even ignoring regular optimization, the compression saved at most 1.4GB.

My preview cache is 75GB.

My images are 1700GB.

Seriously -- what's 1.4GB? (For those not interested in doing the arithmetic, it's 0.079% of my space; it's not even a rounding error.)
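
(The arithmetic checks out, for anyone counting along; a throwaway check using the sizes quoted above, in GB:)

```python
# Sanity-check the percentage from the figures above (all sizes in GB).
saved = 1.4
total = 1700 + 75 + 1.7  # images + preview cache + LR7 catalog
pct = 100 * saved / total
print(round(pct, 3))  # → 0.079
```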

Now they have to uncompress and recompress it all the time, plus it's useless to outside queries. Databases do NOT get significantly slower because they get bigger if they are properly designed, and I would be amazed if the CPU tradeoff for a bit smaller file was a performance improvement in fact rather than theory.

Wish they would have spent that time on other performance issues, personally.

-- Signed grumpy old programmer.
 
I don't think Adobe owe anything to those who want to dig around the SQL - they've never promoted it as a way to get information - and it's certainly not fun to parse the Lua statements that are often returned, is it?

There was a problem with history data corruption too, though I didn't believe it until I had a case where 10 years of history data appeared to cause a slowdown. I had tried everything I could think of, and clearing history was the last resort. Performance was restored, and the halving of the file size was a handy by-product.

John
 
I don't think Adobe owe anything to those who want to dig around the SQL - they've never promoted it as a way to get information
Well, I'll agree they are not interested in making it an open system all that much.
 
Well, I'll agree they are not interested in making it an open system all that much.
Makes it that much harder for a future product that claims to import and process LR edits.
 
Makes it that much harder for a future product that claims to import and process LR edits.

OH, that's a good point to consider... Hopefully they don't also write compressed XMP to the files themselves when doing a 'write metadata to files'. I would hope..
 
OH, that's a good point to consider... Hopefully they don't also write compressed XMP to the files themselves when doing a 'write metadata to files'. I would hope..
I don't think that compressed data is allowed in the XMP standard.
 
Well, I'll agree they are not interested in making it an open system all that much.

They are, or at least were. That's why they have the SDK.

Hopefully they don't also write compressed XMP to the files themselves when doing a 'write metadata to files'.

The history log has never been written to the xmp, and it wouldn't be of much value in another app. On the other hand, unlike competitors, Adobe has always written adjustment instructions to xmp in a form that could be interpreted by another app. So rather than trying to make Adobe the bad guys on this.....

John
 
OH, that's a good point to consider... Hopefully they don't also write compressed XMP to the files themselves when doing a 'write metadata to files'. I would hope..

Come on, it takes 10 seconds to verify this. XMP files are still clear XML text.
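
(And for anyone who would rather script the check than eyeball it, an .xmp sidecar should parse as ordinary XML. The packet below is a minimal hand-made stand-in, not output from a real file:)

```python
import xml.etree.ElementTree as ET

# A minimal XMP packet, standing in for a real sidecar file. Real ones
# from Lightroom are the same thing: plain UTF-8 XML, nothing binary.
xmp_text = """<?xpacket begin="" id="W5M0MpCehiHzreSzNTczkc9d"?>
<x:xmpmeta xmlns:x="adobe:ns:meta/">
  <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
    <rdf:Description rdf:about=""/>
  </rdf:RDF>
</x:xmpmeta>
<?xpacket end="w"?>"""

# Strip the xpacket processing instructions and parse the XML payload.
# For a real sidecar you would read the file's text the same way.
payload = xmp_text.split("?>", 1)[1].rsplit("<?xpacket", 1)[0]
root = ET.fromstring(payload)
print(root.tag)  # → {adobe:ns:meta/}xmpmeta
```

If the parse succeeds, the file is plain XML text, no compression involved.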
 
The history log has never been written to the xmp, and it wouldn't be of much value in another app. On the other hand, unlike competitors, Adobe has always written adjustment instructions to xmp in a form that could be interpreted by another app. So rather than trying to make Adobe the bad guys on this.....

That's good to hear.. I'm not concerned about history - just current develop settings and snapshots.

Come on, it takes 10 seconds to verify this. XMP files are still clear XML text.

I have a disability that causes some 'issues' with me easily checking such things.. And it's been a very long time since I've done that, so I can't remember which program I used anymore (PhotoME, perhaps?). Also bear in mind that I don't use separate XMP files - just embeds in DNG.

Do you know of an easily understood, still-being-developed/recent GUI type program to check that myself? I mean, I know there's ExifTool - but that might take some good time for me to figure out how to use, and I haven't had a pressing need for it yet. (Although I still have my original intelligence.. It just doesn't work properly or quickly anymore. :) )
 
Do you know of an easily understood, still-being-developed/recent GUI type program to check that myself?
If you have a .XMP file, Notepad works just fine.
 
If you have a .XMP file, Notepad works just fine.
Search for "microsoft XML notepad." It's from 2007, but it works fine on my Win 10 64 system.

Phoil
 