
Quick newbie question

Lightroom Version Number: Classic 11.4.1
Operating System: macOS 12 Monterey
I back up my entire Mac to Backblaze and also back up the catalog files and .DNGs to Google Drive.
Is it safe to delete the old .zip backups of the old catalog files from Google Drive (mine go back years)?
I'm guessing it is, but I just wanted to make sure I'm not doing anything really stupid.
 
Yes, it's safe. Keep one or two old catalog backups just in case, but there's no reason to keep years of catalog backups.
 

The correct answer is that you only need to keep the backup catalogs you might need to recover from. Sometimes the recovery is from a stupid user mistake (like removing all of the keywords from ~2000 images and not discovering the mistake for 6 months). I went back over 6 months to find those missing keywords, and I was glad I had backups that far back.

I have a 10-year-old EHD (external hard drive). I would not trust it for critical user data, so it has become my default backup destination. I may have 1000 catalog backups on it. I could take the time to delete most of these, but I don't bother. When the disk fails (and it will eventually) I will lose every backup on it, but I consider that a low-risk event. I may never need these backup catalogs.


 
A fairly typical business scheme is something like this: keep daily backups (if you work daily) for a month, keep an end-of-week backup for a quarter, keep an end-of-month backup for a year, keep an end-of-quarter backup for a few years, and so on. A continual rolling purge. It can be too much trouble to follow precisely, but you can invent your own scheme with the idea that if you need to go back 2 years, you probably don't need to go back precisely 2 years, 3 months and 4 days, just to something "really old". A rough sketch of that kind of rolling purge follows below.
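
For anyone who wants to automate that kind of rolling purge, here's a minimal sketch of the idea in Python. It assumes each backup is named with a leading YYYY-MM-DD date (as Lightroom's own backup folders typically are), the path is a placeholder, and it only prints what it would keep or purge, so nothing is deleted until you're happy with the rules.

from datetime import date, datetime
from pathlib import Path

# Placeholder path -- point this at the folder holding your backup .zips.
BACKUP_ROOT = Path("/Volumes/GoogleDrive/Lightroom Backups")

def parse_date(name: str):
    """Pull a leading YYYY-MM-DD date out of a backup name, if there is one."""
    try:
        return datetime.strptime(name[:10], "%Y-%m-%d").date()
    except ValueError:
        return None

def keep(backup_date: date, today: date) -> bool:
    """Daily for a month, weekly for a quarter, monthly for a year,
    then roughly one per quarter after that."""
    age = (today - backup_date).days
    if age <= 31:
        return True                                    # keep everything recent
    if age <= 92:
        return backup_date.isoweekday() == 7           # Sundays only
    if age <= 366:
        return backup_date.day <= 7 and backup_date.isoweekday() == 7   # first Sunday of the month
    return (backup_date.month in (1, 4, 7, 10)
            and backup_date.day <= 7
            and backup_date.isoweekday() == 7)         # first Sunday of the quarter

today = date.today()
for item in sorted(BACKUP_ROOT.iterdir()):
    backup_date = parse_date(item.name)
    if backup_date is None:
        continue                                       # not a dated backup, leave it alone
    print("KEEP " if keep(backup_date, today) else "PURGE", item.name)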
 
I started to say just keep your most recent couple of cat backups (like I do). But Clete just made me realize something from his answer. What if you do something dumb and don't realize it for a few weeks or even months? You can then go back to an old version of the cat.

But then you might be in a quandary of missing your recent work and would have to do some juggling.

Hmmmmm. I never thought about that.
 

While I have yet to do it in this exact context, with respect to catalog data (as opposed to the images themselves), you could always restore the OLD catalog somewhere, open it, find what you need and export it in some fashion (as a small catalog, or as the images' metadata or XMP sidecars), then close OLD, open the CURRENT catalog and import or re-key that data.

I think this tends to become more interesting with the backup of image data. I have always retained versioned backups, and have had to go back to them due to bit rot, or more precisely file corruption for unknown reasons. In one case I had to go back about 2 years to find an uncorrupted version of the file. But if it were, for example, an accidentally deleted image file (also deleted from the catalog), you might need to go back to a versioned file backup for the image, and also to an old backup of the catalog for its develop and other settings (or recreate them, of course, but it might be a whole folder of images).

It is important that image data (i.e. the files) AND catalog data be "versioned", or what is often called "able to be restored to a point in time", to protect against human error and, much less frequently, against bit rot.

(The term "bit rot" more precisely means corruption in a file that comes from a disk write or read failure that goes undetected, and so introduces corruption silently; you only discover it days, weeks or years later when you go to actually look at that image or other data.)
 
What if you do something dumb and don't realize it for a few weeks or even months? You can then go back to an old version of the cat. But then you might be in a quandary of missing your recent work and would have to do some juggling.

Yep, I've had to do that a couple of times, the perils of living dangerously with beta software in my case!

I keep one backup from the end of each year, then one every couple of months for the 6-12 month range, then one a month for the last 6 months, then one a week for the last couple of months. It's probably overkill for most people, but I have had to restore specific data (GPS data, in that case) from more than a year ago.
 
Linwood, in terms of actual image (data) backup... you are correct. I have used several methods of backup over the decades, and I used to retain at least two previous versions of files. But now my image files are at a minimum 100 MB, and sometimes 200 MB, because I shoot medium format, so I no longer retain previous versions on the sync, and there is some risk in that.

My image files are all on one internal 8TB SATA SSD and backed up to 4 separate 8TB spinning drives as exact copies of my image folder. I do that using GoodSync. I only back up the latest version of each file and don't keep previous versions, because that would get to be a mess and require huge storage capacity. But yes, there is a risk that some of the files could be corrupted in the copy (or sync) process and I might never know it, because I have many tens of thousands of image files.

For example, when I switch drives as my main data drive and reconnect LR to it after copying all the files over to the new disk (like I will do when I buy a new 8TB M.2 SSD sometime in the near future), that copy process could corrupt some of the files. There are programs you can run to check the integrity of every file, compare it with the original, and make sure the copy is 100% correct, but that takes a while and I have never done it. For all I know, some of the raw files on my data disk might be corrupted and I would not know it, because I haven't opened some of them in years....
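
For what it's worth, the check itself doesn't need special software. A bare-bones version of the idea looks like this (paths are placeholders): hash every file on the original drive and its counterpart on the copy, and report anything missing or mismatched. Dedicated verification tools do the same thing with better reporting and the ability to resume.

import hashlib
from pathlib import Path

# Placeholder paths -- the original drive and the copy you want to verify.
SOURCE = Path("/Volumes/OldDrive/Photos")
COPY = Path("/Volumes/NewDrive/Photos")

def sha256(path: Path) -> str:
    """Hash a file in 1 MB chunks so large raw files don't fill memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return h.hexdigest()

problems = 0
for src in SOURCE.rglob("*"):
    if not src.is_file():
        continue
    dst = COPY / src.relative_to(SOURCE)
    if not dst.exists():
        print("MISSING :", dst)
        problems += 1
    elif sha256(src) != sha256(dst):
        print("MISMATCH:", dst)
        problems += 1
print(f"Done. {problems} problem(s) found.")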
 
I only back up the latest version of each file and don't keep previous versions ... there is a risk that some of the files could be corrupted in the copy (or sync) process and I might never know it.

You might be surprised, especially if you generally edit raw images in LR only and don't produce TIFFs, since those images are never changed once imported. It's one of the great features of Lightroom that I think is under-appreciated. Now DNG may be a different matter if you write metadata back, as that updates the DNG and would trigger a new backup, and obviously a TIFF is updated every time you edit the original. But I find it a pretty small burden to do versioned backups (I use both GoodSync and Cloudberry) for images.

I tend to be paranoid and not trust software. Different people feel differently, but that distrust comes into this decision in the following way. Suppose Lightroom (or some unrelated program) got a software release that accidentally overwrote a block or two of every 10,000th file it touched. Just a mistake. That could quickly get backed up to all your backups before you even noticed, because you wouldn't notice until you opened that 0.01% of files. With versions you get the before and after, and can recover.

This sort of bad-software thing does happen. Some years ago a version of Lightroom was released for Mac that just deleted all the contents of the first folder alphabetically on the drive. NOT the first Lightroom folder, just some other random data folder. In that case it deleted the files (though a lot of backup sync programs, including GoodSync if you ask it to, delete from the backup if the original is deleted). I've also seen it over the years in commercial software, especially ERP systems with tens of thousands of modules, where it might not show up until some end-of-year process is run.

Incidentally, this is why I always recommend people back up the LR catalog with their backup software PLUS let Lightroom produce its own backups. This one has historical justification: for several versions of LR, a bug corrupted every single catalog backup it made (if you had large catalogs). Every one. No apparent error. You only noticed if you tried to restore.

Software has bugs, and always will. It has gotten much better since the early days, and people get lulled into thinking nothing bad can happen because, frankly, it very, very, very rarely does. I just think everyone should make informed decisions (you clearly are well informed, but others are reading), and I hope it is useful to play out some scenarios where versioning may be needed. To me backups are best if they:

- Are run regularly
- Are tested regularly
- Are versioned (point-in-time restore)
- Have some off-site copy somewhere
- Are compared to the original periodically (to detect bit rot on either side)
 