
Test your catalogs (and backups) - How To

Status
Not open for further replies.

Linwood Ferguson
Lightroom Guru
Joined: Jan 18, 2009
Messages: 2,587
Location: Fort Myers, FL
Lightroom Experience: Advanced
Lightroom Version: Classic (LR 7+)
Operating System: Windows 10
Lightroom uses SQLite as its database, so your "catalog" is a big SQLite database.

Lightroom will back that up, test its integrity, and optimize it all on its own. That's good. Do it. However, it is never wise to trust any one program, so what follows is a very brief outline of how to test your catalog for SQLite integrity. This MAY find errors Lightroom misses and indicate corruption (and it is almost certainly real corruption). It MAY also miss Lightroom-specific problems that Lightroom's own tests would catch -- it is possible for the database to have perfect integrity but contain bad data that Lightroom would find in its test. In other words, doing both is a goodness -- both should pass. If either fails, you have issues.

Also, and perhaps most importantly -- TEST YOUR BACKUPS. Otherwise you never know if they are broken. This gives you a quick way to tell whether they are valid SQLite databases.

To test your catalog:

Step 1: Download the command-line utilities for SQLite from the authoritative source: SQLite Download Page. You want the "Precompiled Binaries for Windows", specifically the "SQLite Tools" command-line bundle, or the equivalent for macOS (no real choices there). You do not want the Windows "DLL".

Note that from here on my comments are for Windows -- I don't speak Mac, but I think the steps are more or less identical except perhaps for terminology.

Step 2: Unzip the downloaded zip file and put the file "sqlite3" somewhere convenient, like on your desktop (it ends in .exe on Windows; I think there is no extension on Mac).

Step 3: Figure out where your Lightroom catalog is -- the full path and name. Open Edit > Catalog Settings to find it; e.g. in my example below the path is "T:\LightroomCatalog\LR7.lrcat". You need to know that whole thing, and it may be best to put it in quotation marks. Then close Lightroom (you cannot check integrity with Lightroom open).

[Screenshot: catalogLocation.jpg -- Edit > Catalog Settings showing the catalog location]


Step 4: Open a command prompt (or whatever they call it on Mac), and enter first the path to where you put "sqlite3", then the catalog name, then the rest of the line as shown. Case may matter. You should get "ok", or a list of problems (or, if you type something wrong, hopefully an error you can make sense of so you can retype it correctly). My sqlite3 was on my desktop, so I put "desktop" in first, which is where Windows has it.
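For reference, the check being run here is SQLite's standard built-in integrity check pragma; a sketch, assuming the example catalog path from Step 3 and sqlite3 sitting on the desktop (your paths will differ):

```shell
# Run SQLite's built-in structural integrity check on the catalog.
# Prints "ok" if the database is intact, or a list of errors if not.
# (Windows Command Prompt; on Mac/Linux the pattern is identical.)
sqlite3 "T:\LightroomCatalog\LR7.lrcat" "PRAGMA integrity_check;"
```

Quoting the paths matters if they contain spaces, which Windows paths often do.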

[Screenshot: ok.jpg -- the command prompt showing the integrity check returning "ok"]


That's it -- if you get OK, you are good. If not... worry, maybe start a new thread here (or reply here if you cannot get it to run at all).

TEST YOUR BACKUPS NEXT.

Step 1: Backups are zip files too (at least recent versions are). So first unzip one and put it SOMEWHERE ELSE. DO NOT UNDER ANY CIRCUMSTANCES PUT IT ON TOP OF YOUR REAL CATALOG. **** PUT IT SOMEWHERE ELSE ****. I dragged mine to my desktop; here's how it looked on Windows. Notice that I had the zip file open and dragged the contained LR7.lrcat file onto my desktop. BE SURE YOU PUT THIS SOMEWHERE OTHER THAN YOUR CATALOG, SOMEWHERE YOU CAN DELETE IT WHEN DONE. Clear? Putting it on top of your existing catalog wipes the catalog out; do not do that. If you do, do not say you were not warned!

[Screenshot: Copy.jpg -- dragging the contained LR7.lrcat out of the backup zip onto the desktop]


Step 2: Now run the exact same command as above, but on the backup file instead, wherever you put it. Here I used a catalog that I knew was bad, so you can see what a failure looks like. The actual errors are not very helpful to most people, but getting any at all means that backup is BAD and cannot be used to restore. It is always a good idea to test integrity before trying a restore, but it is much more useful to test BEFORE you need a restore, so you can make a better backup!
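If you have extracted several backups to test, a small loop saves typing -- a Mac/Linux shell sketch (the folder path is an example; on Windows you would simply run the single command once per file):

```shell
# Check every extracted backup copy in a test folder.
# Each line prints the file name followed by "ok" or error details.
for f in ~/Desktop/backup-tests/*.lrcat; do
  printf '%s: ' "$f"
  sqlite3 "$f" "PRAGMA integrity_check;"
done
```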

[Screenshot: bad.jpg -- the integrity check on a corrupt backup listing a series of errors]



So again...

If you test with SQLITE3 and it says it is bad -- it is bad. Get help. If it says "ok", it MAY be OK; continue testing (and backing up) with Lightroom as well. Do both.

And test your backups -- Lightroom NEVER tests your backups, unless you need to restore, and then it is too late.

Linwood

PS: This should work for older Lightrooms as well; newer sqlite3 tools can generally read databases created by older versions. Very old LR backups are not zipped -- if so, just point sqlite3 at the catalog file directly.
 
Thanks, this got me to finally try out one of the free SQLite apps I had downloaded -- especially seeing as recent versions of LR are reported to corrupt catalog backups. ... And it now seems that I'm one of those affected. :(

However, I used one of the graphical utilities I had downloaded a while ago - but never installed.
For those interested, it's called "SQLite Expert Personal" - found at: SQLite administration | SQLite Expert
Direct download: http://www.sqliteexpert.com/v5/SQLiteExpertPersSetup64.exe (as LR only works with 64bit these days)

After installing, choose menu items: "File -> Open Database", then choose "Database -> Check".

For me, trying it out on the main catalog returned "ok", but checking backups returned a long list of errors. I got errors for all catalog backups created starting when LR first turned into LR Classic 7.0. Only LR6/2015 backups returned "ok".

Since I seem to be affected by the backup corruption bug, I need to find another alternative for catalog backups.
 
Since I seem to be affected by the backup corruption bug, I need to find another alternative for catalog backups.
You don't need a tool. Just close Lightroom, then copy (not cut) the catalog and paste it into another folder somewhere. It's just one file; there's no magic in the backup. But only do it with Lightroom closed.
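As a sketch of that manual route on Mac/Linux (paths are examples; run only with Lightroom closed -- on Windows an Explorer copy/paste does the same thing):

```shell
# Copy the closed catalog to a dated backup file in another folder.
# Both paths are examples; adjust to your own catalog and backup locations.
cp "/Volumes/Photos/LightroomCatalog/LR7.lrcat" \
   "/Volumes/Backups/LR7-$(date +%Y-%m-%d).lrcat"
```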
 
You don't need a tool. Just close lightroom, and copy (not cut) the catalog and paste it into another folder somewhere. It's just one file, there's no magic in the backup. But do it only with lightroom closed.

I'm a bit of an automation and compression nut. :) So I want to create a button in Directory Opus to automatically copy and compress with 7-zip Zstandard. Or set up a separate task in Cobian Backup -- but I'm not sure if the 'create shadow copy' function will work correctly if a catalog is open. I'll have to test that out.

But in the meantime, I'll just do the manual route. I haven't been using LR a whole lot anyways lately, since this computer is so slow.. I'm waiting till I get a new one soon.
 
Me too: main catalog OK, backup catalog not OK. Disappointing ...

Yeah, hopefully it gets fixed soon, as it's clearly a bug, but the good news is that awareness is a good interim fix. Also, everyone SHOULD be backing up their catalog in their normal disk backup process, and those backups will not be affected by this.
 
Also, everyone SHOULD be backing up their catalog in their normal disk backup process, and those will not be affected by this.

This is what I have always done and is the recommendation that I give when asked. I do this every time I add photos, or post process a bunch.
I have never trusted compressed databases as good backups.
 
I have never trusted compressed databases as good backups.
:cautious:

Although it's not the compression that's doing it. Stable compression doesn't do this sort of thing. Even a copy operation can screw it up, if the code doing it got messed with. And in this case, we don't know whether the code that got messed with was the copy, the compression, or both. I even recompress the zipped catalogs using a 7-zip Zstandard codec, and they contained the exact same errors at both ends. However, we do know Adobe's track record with these sorts of things -- so therefore I'll go put my dunce hat on and sit in the corner. :bag:

Ehh, at least I found out about it before I needed one of the backups.
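One cheap sanity check on any plain copy, or on a zip round trip after extraction, is a byte-for-byte comparison against the original (Mac/Linux shell; file names are hypothetical -- Windows has `fc /b` for the same job):

```shell
# Compare the original catalog against a copy or an extracted backup.
# Since zip is lossless, a clean round trip must match byte-for-byte.
if cmp -s "LR7.lrcat" "LR7-extracted-copy.lrcat"; then
  echo "identical"
else
  echo "FILES DIFFER"
fi
```

Note this only verifies the copy/compress step was faithful; it says nothing about whether the original catalog itself was healthy, which is what the integrity check above is for.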
 
.... But this is one very good reason to always do a 'save metadata to files', which I do religiously.
 
Although it's not the compression that's doing it.. Stable compression doesn't do this sort of thing.

I wouldn't assume it's not the compression; indeed, I'm pretty sure it is. Of course, lossless compression done right does not cause this -- but if one makes mistakes implementing lossless compression...
Ehh, at least I found out about it before I needed one of the backups.
One reason for the post, to try to encourage people to find out early.
.... But this is one very good reason to always do a 'save metadata to files', which I do religiously.
Yeah, a mixed blessing, as it preserves some info but loses a lot of other data (e.g. I have tens of thousands of photos in published collections that would get lost, and I'm using more and more virtual copies). But far better than nothing, indeed.
 
Thanks a lot for this great howto!!
Is there a known backup corruption bug in Lightroom 7.3?
Where can I read more about it and other known bugs?

Thommy
 
Is there a known backup corruption bug in Lightroom 7.3?
Where can I read more about it and other known bugs?

Yep, there is -- but I'm not sure it affects everybody. Those who are seeing the issue, including myself, seem to be seeing it in ALL backups since LR 7.0! I don't know where to see it officially, but I first heard mention of it via John Beardsworth and others on Luminous Landscapes.


I have never trusted compressed databases as good backups.

Actually, there are compressors that add recovery information to the compressed archive. Although I don't use those, an argument could be made that they're even a tad more secure than not compressing. (Discounting things like PAR/2 recovery blocks.)

However, I think we can both agree that we wish Adobe didn't force compression on us -- both internally since 7.0 and externally via zipping since 6.0. But for very different reasons. :)

First, the forced external zipping makes us both wait while it zips. And for me, the internal compression means I don't get as good recompression when I redo their forced zipping using more efficient methods, while still being able to continue using LR. Before the forced internal compression, by using special options with 7-zip's lzma/2 codec, I was able to 'deep-freeze' a ~1.2GB catalog down to roughly 46MB (those catalogs tested perfectly fine in my recent tests) -- and by using the Zstandard-level-17 codec for regular SUPER-fast backup compression, those same catalogs would get down to ~75MB or so. After the forced internal compression, the now ~500MB catalog will only get down to ~136MB with those same settings (and there haven't been any new images added that could explain that large a difference).
 
However I think we can both agree that we wish Adobe didn't force compression on us. Both internally since 7.0, nor externally via zipping since 6.0. But for very different reasons. :)
Absolutely, especially internally for me, as it reduces the ability to directly access the SQL data, which is quite useful at times. But in my mind, applications should generally not be the ones compressing archival data -- there are WAY too many tools that might pile on top of that, from backup software to de-dup stuff to cloud-related aspects.

I'm an old database guy (as in I've been around them since they were new -- not as in being an old guy, which is a different subject, and how dare you think that :mad:), and to me an APPLICATION compressing data INSIDE of a database is just like random strangers photo-bombing your nicely composed images.

I think they were desperate to hold up some performance related wins and these looked and sounded good to the mass market.

The complaints in various forums about it appearing to happen frequently spurred this post; I am not aware of Adobe making any public statements about the issue, but even absent rumors it is always well worth checking your backups!
 
Absolutely, especially internally for me, as it reduces the ability to directly access SQL data that is quite useful at time. But in my mind applications should generally not be the ones doing compression of archival data -- there's WAY too many tools that might pile on top of that, from backup software to de-dup stuff, cloud related aspects.

Actually, the internal part is the worst for me as well, precisely for the last thing you mention above: cloud-related aspects. I use a free Google Drive with its 15GB limit, because I'm too poor to easily afford paid services. So being able to get catalogs really small means more backups can be synced within that space.
 
The complaints in various forums about it appearing to happen frequently spurred this post; I am not aware of Adobe making any public statements about the issue, but even absent rumors it is always well worth checking your backups!
It was as a result of one of our Members here - @noborg - sending me a series of corrupted backup catalogs that made me flag it up with the product manager, and then more cases started coming to light. I don't think Adobe's said anything publicly, but I was able to reliably reproduce the issue with multiple catalogs. Hopefully we won't be waiting too long for a proper solution. It's been a good reminder of how important it is to check we can restore our backups - of any description.
 
I got around to following Ferguson's directions, and found that, while the catalog is fine, the most recent backup did indeed have errors. I haven't updated to 7.3 or 7.3.1; does anyone know whether the problem has been solved?
 
I got around to following Ferguson's directions, and found that, while the catalog is fine, the most recent backup did indeed have errors. I haven't updated to 7.3 or 7.3.1; does anyone know whether the problem has been solved?
Yes. Adobe did a mea culpa in the 7.3.1 release notes.
 
Also, I agree with what I think Ferguson said somewhere: that the problem was actually widespread -- just that people don't test their backups, thinking "everything's fine" or "that problem probably didn't affect me". The latter of which I was also guilty of for a bit after I heard about some corruption reports. ... Either that, or they did external backups in place of, or in addition to, the LR-generated ones -- and didn't test the LR ones.

I've tested several new backups with 7.3.1, though. And they all reported "ok".
 
... or they did external backups in place of, or in addition to the LR generated ones - and didn't test the LR ones.
That's me -- I never even counted on the LR-generated backups, except as a last resort if I were in very deep trouble with my normal data backups.
 
Thanks, Linwood -- as another old database guy, I really appreciate the way you've laid things out. (I started with Adabas and survived the database wars largely unscathed. I was sufficiently nerdy to have Ted Codd's original research papers in my filing cabinet.)

Dave
 
So I want to create a button in Directory Opus to automatically copy and compress with 7-zip Zstandard.

Hoggy,
I run Directory Opus as my file manager, but I've never dived into configuring it (I have a user-friendly setup that I copied from the forums). Maybe I should be trying a bit harder.
Dave
 
I run Directory Opus as my file manager, but I've never dived into configuring it (I have a user-friendly setup that I copied from the forums). Maybe I should be trying a bit harder.

Yeah, you definitely should! Although there's just so much that can be done that even I am probably only using 10-20% of its potential. And I've been using it since its original Commodore Amiga days!

I tried learning some JavaScript for creating full-fledged scripting buttons, as opposed to the standard internal/external command button 'scripts'. But I just never made the connection between what I learned and DOpus's API business. I figure I'll eventually get back to figuring it out, though... right after I've forgotten what JavaScript I've learned. :)
 