Lightroom Classic RTX3060 Performance?

Status: Not open for further replies.

angusgibbinsau

New Member
Joined: Jan 18, 2023
Messages: 7
Lightroom Version Number: 12.1
Operating System: Windows 11
Hi,

Long-term LRQ lurker, first-time poster. Wanted to know if anyone has any LrC experience with the Nvidia RTX 3060?

Specifically, I currently have a GTX 1060 6GB and wanted to know if I'd get any noticeable performance upgrade by going to an Nvidia RTX 3060 12GB? Or is LrC just too much of a spaghetti of legacy code and frameworks to make any real difference? Opinions/advice appreciated.

Thanks!
 
Welcome to the forum. I do not have either Nvidia card, but the short answer to your question is "it depends". It will depend on what actions you are doing and what the rest of your system looks like. Additional information would help folks answer your question.

Thanks,

--Ken
 
Hi,

Long-term LRQ lurker, first-time poster. Wanted to know if anyone has any LrC experience with the Nvidia RTX 3060?

Specifically, I currently have a GTX 1060 6GB and wanted to know if I'd get any noticeable performance upgrade by going to an Nvidia RTX 3060 12GB? Or is LrC just too much of a spaghetti of legacy code and frameworks to make any real difference? Opinions/advice appreciated.

Thanks!
I have recently upgraded from a GTX 1050 Ti to an RTX 3060 (not Ti). I find Lightroom Desktop noticeably faster and more pleasant to use. For example, the picker in Colour Balance works exactly as you would expect; with my old 1050 Ti it was irritatingly laggy. Generally, everything is more responsive and snappier. I'm not sure whether the difference is attributable to the extra VRAM or the better processing power, but for me it was a worthwhile upgrade. GPUs are, as you will know, still stupidly expensive, but the 3060 is at the saner end of the scale.

My system is a Core i9 with 16GB RAM, and I use a 4k monitor.
 
I have recently upgraded from a GTX 1050 Ti to an RTX 3060 (not Ti). I find Lightroom Desktop noticeably faster and more pleasant to use. For example, the picker in Colour Balance works exactly as you would expect; with my old 1050 Ti it was irritatingly laggy. Generally, everything is more responsive and snappier. I'm not sure whether the difference is attributable to the extra VRAM or the better processing power, but for me it was a worthwhile upgrade. GPUs are, as you will know, still stupidly expensive, but the 3060 is at the saner end of the scale.

My system is a Core i9 with 16GB RAM, and I use a 4k monitor.
Did you just upgrade your GPU, or were there other upgrades to your system as well?

--Ken
 
I have recently upgraded from a GTX 1050 Ti to an RTX 3060 (not Ti). I find Lightroom Desktop noticeably faster and more pleasant to use. For example, the picker in Colour Balance works exactly as you would expect; with my old 1050 Ti it was irritatingly laggy. Generally, everything is more responsive and snappier. I'm not sure whether the difference is attributable to the extra VRAM or the better processing power, but for me it was a worthwhile upgrade. GPUs are, as you will know, still stupidly expensive, but the 3060 is at the saner end of the scale.

My system is a Core i9 with 16GB RAM, and I use a 4k monitor.
Thanks. Are you using Lightroom or Lightroom Classic?
 
Welcome to the forum. I do not have either Nvidia card, but the short answer to your question is "it depends". It will depend on what actions you are doing and what the rest of your system looks like. Additional information would help folks answer your question.

Thanks,

--Ken
It's not really an "it depends" question in this instance, TBH. Lightroom Classic is a behemoth of legacy code and obsolete frameworks. It will always be a major bottleneck, and in a lot of cases (such as mine, I suspect) upgrading hardware that is already more than adequate won't make a difference, because LRC just isn't capable of fully utilising the hardware's capabilities; it's too much of a behemoth.

Lightroom (Cloud) doesn't have this problem because it's a ground-up rewrite and doesn't have a lot of these legacies embedded within it. It's a lot easier for it to be streamlined and to make use of newer hardware capabilities.

Unfortunately, at this time I can't migrate to Lightroom (and I want to) because the feature parity isn't quite there yet (mainly smart collections and nested keywords).
 
It's not really an "it depends" question in this instance, TBH. Lightroom Classic is a behemoth of legacy code and obsolete frameworks. It will always be a major bottleneck, and in a lot of cases (such as mine, I suspect) upgrading hardware that is already more than adequate won't make a difference, because LRC just isn't capable of fully utilising the hardware's capabilities; it's too much of a behemoth.

Lightroom (Cloud) doesn't have this problem because it's a ground-up rewrite and doesn't have a lot of these legacies embedded within it. It's a lot easier for it to be streamlined and to make use of newer hardware capabilities.

Unfortunately, at this time I can't migrate to Lightroom (and I want to) because the feature parity isn't quite there yet (mainly smart collections and nested keywords).
Disclaimer: I'm not a developer, and I don't work at Adobe and never did.

Why do you assume that LrC is all "legacy" code? And why do you assume that Lightroom cloudy was a ground up rewrite?

I suspect that some legacy code was cleaned up in the conversion of LR pre-Classic from 32 bit to 64 bit code.

To respond to the OP, I upgraded from an old NVidia 660 Ti to a 3060 Ti (not a plain old 3060) and I saw significant improvements in some operations. I also got improvements in LrC when I upgraded from an Intel 3930K CPU based system to an AMD 3900X CPU based system, not all at once of course. I also got some improvements when I upgraded from a "legacy" SATA SSD to an NVMe stick SSD.

However, my biggest performance improvement in LrC was a better workflow and a better understanding of LrC functionality.

My comment is that "it depends" on which actions you are performing in LrC.
 
It's not really an "it depends" question in this instance, TBH. Lightroom Classic is a behemoth of legacy code and obsolete frameworks. It will always be a major bottleneck, and in a lot of cases (such as mine, I suspect) upgrading hardware that is already more than adequate won't make a difference, because LRC just isn't capable of fully utilising the hardware's capabilities; it's too much of a behemoth.

Lightroom (Cloud) doesn't have this problem because it's a ground-up rewrite and doesn't have a lot of these legacies embedded within it. It's a lot easier for it to be streamlined and to make use of newer hardware capabilities.

Unfortunately, at this time I can't migrate to Lightroom (and I want to) because the feature parity isn't quite there yet (mainly smart collections and nested keywords).
But it is an "it depends" question as to how you use LRC and where you are expecting to see performance gains. The gains from a better GPU are not necessarily across the board in all LRC functions. So, for example, if you are expecting faster imports from a better GPU, you may be more disappointed than if you were exporting large numbers of files. That was why I said "it depends". There is no one standard measure of GPU performance in LRC.
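
If you want a number that is meaningful for your own catalogue, one option is to time the task you care about rather than lean on a generic benchmark. Here is a rough sketch in Python (nothing official; the export folder path is just a placeholder you would change) that reports how many exported files land in a destination folder per minute while LrC runs an export, so you can compare before and after a hardware change:

# Rough sketch: watch an LrC export destination folder and report throughput.
import os
import time

EXPORT_DIR = r"D:\LightroomExports"   # placeholder path - point at your export folder
INTERVAL = 30                         # seconds between samples

def count_exports(path):
    # Count finished image files in the export folder.
    exts = (".jpg", ".jpeg", ".tif", ".tiff")
    return sum(1 for name in os.listdir(path) if name.lower().endswith(exts))

previous = count_exports(EXPORT_DIR)
print("Watching", EXPORT_DIR, "- press Ctrl+C to stop")
try:
    while True:
        time.sleep(INTERVAL)
        current = count_exports(EXPORT_DIR)
        per_minute = (current - previous) * (60 / INTERVAL)
        print(f"{current} files so far, roughly {per_minute:.1f} files/minute")
        previous = current
except KeyboardInterrupt:
    pass

Run the same export of the same images before and after any upgrade, and the files-per-minute figure gives you a like-for-like comparison for your own workflow.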

--Ken
 
Disclaimer: I'm not a developer, and I don't work at Adobe and never did.

Why do you assume that LrC is all "legacy" code?
I didn't say it was all legacy code. I said it is a behemoth of legacy code and frameworks. This isn't exclusive to LrC either, BTW; Adobe is notorious for it, and it's a pretty well-known fact in the creative industries.
And why do you assume that Lightroom cloudy was a ground up rewrite?
Because that's essentially what it is.
I suspect that some legacy code was cleaned up in the conversion of LR pre-Classic from 32 bit to 64 bit code.
Legacy x86 code, sure. But there's still a limit to what they can clean up and what performance gains they can get from it.
However, my biggest performance improvement in LrC was a better workflow and a better understanding of LrC functionality.
I'm happy with my LrC workflow and am not looking to make changes to it. Additionally my workflow can't make up for bottlenecks within software/hardware.
My comment is that "it depends" on which actions you are performing in LrC.
I mean, essentially most of us do some variation of:

1. Import RAW files.
2. Organise images.
3. Process images.
4. Export processed files.
 
But it is an "it depends" question as to how you use LRC and where you are expecting to see performance gains. The gains from a better GPU are not necessarily across the board in all LRC functions. So, for example, if you are expecting faster imports from a better GPU, you may be more disappointed than if you were exporting large numbers of files. That was why I said "it depends". There is no one standard measure of GPU performance in LRC.

--Ken
For the sake of argument:
I import RAW files
I process RAW files
I export JPEG files
I'll sometimes round trip to a third party plugin (like DxO PhotoRAW) or Photoshop.

By far the biggest time sink is generating Standard previews upon import (and Smart Previews, which I don't build for every single import, only if I'll need them). There's also glitchiness when using the healing and masking tools, particularly when using a Wacom tablet.

Let me rephrase my original question, though. Because of Lightroom Classic's code base and architecture (which is archaic and always will be, nature of the beast), there will always be a ceiling on what can be accomplished by throwing more hardware resources at LRC. Will the graphics card I specified raise that ceiling? Or will it be a waste of money?

Unfortunately, Lightroom Classic being the fickle beast that it is, I can't really rely on performance benchmarks to gauge whether an upgrade will help me at all.

Also, I know I can adjust my workflow and expectations and import photos before I go to bed so the previews are generated by morning, but software is supposed to work for me and fit how I work, not the other way round.
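
Before spending the money, though, it is probably worth checking whether the 1060 is actually being maxed out during the slow parts. Here is a quick and dirty Python sketch (assuming only that the standard nvidia-smi tool installed with the Nvidia driver is on the PATH; nothing Lightroom-specific) that samples GPU load and VRAM once a second while previews build or masks render:

# Quick and dirty: sample GPU utilisation and VRAM via nvidia-smi while LrC is
# doing something slow (preview builds, AI masking, exports). If the GPU sits
# mostly idle with VRAM to spare, a faster card probably won't raise the
# ceiling much; if it is pegged, an upgrade is more likely to pay off.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=utilization.gpu,memory.used,memory.total",
    "--format=csv,noheader,nounits",
]

print("GPU%   VRAM used / total (MiB)   - Ctrl+C to stop")
try:
    while True:
        out = subprocess.run(QUERY, capture_output=True, text=True).stdout.strip()
        first_gpu = out.splitlines()[0]   # assume a single GPU for simplicity
        util, used, total = [field.strip() for field in first_gpu.split(",")]
        print(f"{util:>4}   {used:>6} / {total}")
        time.sleep(1)
except KeyboardInterrupt:
    pass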
 
It's not really an "it depends" question in this instance, TBH. Lightroom Classic is a behemoth of legacy code and obsolete frameworks. It will always be a major bottleneck, and in a lot of cases (such as mine, I suspect) upgrading hardware that is already more than adequate won't make a difference, because LRC just isn't capable of fully utilising the hardware's capabilities; it's too much of a behemoth.

Lightroom (Cloud) doesn't have this problem because it's a ground-up rewrite and doesn't have a lot of these legacies embedded within it. It's a lot easier for it to be streamlined and to make use of newer hardware capabilities.

Unfortunately, at this time I can't migrate to Lightroom (and I want to) because the feature parity isn't quite there yet (mainly smart collections and nested keywords).
There is much about this response that I do not agree with, and you are painting the wrong picture for the OP. There is legacy code that I'm sure Adobe wishes they could wipe clean as the years go by, but they have made very significant strides recently in utilizing more cores and threads, as well as the GPU, for more tasks. For example, last year they put the GPU to work very efficiently on exports. Moving to a 3060 will pay him dividends. It did for me last year when I moved to the 3080. I am about to build with the fastest gaming card in the world right now, the 4090, and it is going to help even more over the next two years as Adobe continues to make LR better in terms of power, efficiency and use of memory, cores, threads, the GPUs and the new, incredibly powerful top-end CPUs.
You are 100% wrong when you say better hardware doesn't help. Adobe just made a big leap with AI masking that requires a lot of memory and computational power. That is a good thing. If everyone on this board had a new mid-range desktop (Windows or Mac), 90% of the posted problems here would vanish. I'm not saying everyone has to do that. I'm just saying the hardware matters with LR and PS. Why do you think all the bench testers in my PC magazines and forums run PS and LR on their tests with new machines? You can see the improvements as the months go by and new products are released.
You Sir, are incorrect and remind me of the Adobe-hating posters on all the photography forums. Some of those guys are paid trolls. I'm not saying you are. But your post is wrong in many regards.
 
Hi,

Long term LRQ lurker, first time poster. Wanted to know if anyone has any LrC experience with the Nvidia RTX3060?

Specifically, I currently have a GTX1060 6G and wanted to know if I'd get any noticeable performance ugprades by going to a Nvidia RTX3060 12G? Or is LrC just too much of a spaghetti of legacy code and framework to make any real difference? Opinions/advice appreciated.

Thanks!
See my comment above replying to some bad advice. Yes, get the 3060. It is cheap right now and you will see a difference, not just with LR but with all kinds of stuff. Yes, it makes a difference. Yes. Yes. Yes. And it will make more of a difference over the next year or two. Or better still, get the 3080. Those prices are falling as the 40 series zooms forth.
 
There is much about this response that I do not agree with, and you are painting the wrong picture for the OP. There is legacy code that I'm sure Adobe wishes they could wipe clean as the years go by, but they have made very significant strides recently in utilizing more cores and threads, as well as the GPU, for more tasks. For example, last year they put the GPU to work very efficiently on exports. Moving to a 3060 will pay him dividends. It did for me last year when I moved to the 3080. I am about to build with the fastest gaming card in the world right now, the 4090, and it is going to help even more over the next two years as Adobe continues to make LR better in terms of power, efficiency and use of memory, cores, threads, the GPUs and the new, incredibly powerful top-end CPUs.
I am the OP

You are 100% wrong when you say better hardware doesn't help.
That's not what I said at all. My argument wasn't that better hardware doesn't help; my argument was that there is a limit to how much better hardware can help, given the nature of Lightroom Classic's underlying architecture. Lightroom Classic itself will always be a bottleneck.

This is articulated pretty well here:
What GPU (video card) is best for Lightroom Classic?
Lightroom Classic cannot effectively utilize a high-end GPU at the moment, but we still recommend a mid-range GPU as many other related applications like Photoshop do use the GPU more heavily. For most users, an NVIDIA GeForce RTX 3070 Ti is a solid choice. The exception to this is the “Enhanced Details” feature which can greatly benefit from having a more powerful GPU.
Why do you think all the bench testers in my PC magazines and forums run PS and LR on their tests with new machines?
Do they though? I can find a few specific reviews around Photoshop, a few around LR Classic. I tend to find a lot of information on 3D rendering, drop shadows and the like, not so much around Photoshop or specifically LRC performance.
You Sir, are incorrect and remind me of the Adobe-hating posters on all the photography forums. Some of those guys are paid trolls. I'm not saying you are. But your post is wrong in many regards.
You do you.

That said, I think I'm going to give the 3060 (probably Ti) a go. So thanks for the advice.
 
Adobe just made a big leap with AI masking that requires a lot of memory and computational power.
That actually is a good point and something that I hadn't thought of; the new masking and healing features are probably among the biggest improvements to LrC in years. I can see extra GPU grunt being a big deal there.
 
For the sake of argument:
I import RAW files
I process RAW files
I export JPEG files
I'll sometimes round trip to a third party plugin (like DxO PhotoRAW) or Photoshop.

By far the biggest time sink is generating Standard previews upon import (and Smart Previews, which I don't build for every single import, only if I'll need them). There's also glitchiness when using the healing and masking tools, particularly when using a Wacom tablet.

Let me rephrase my original question, though. Because of Lightroom Classic's code base and architecture (which is archaic and always will be, nature of the beast), there will always be a ceiling on what can be accomplished by throwing more hardware resources at LRC. Will the graphics card I specified raise that ceiling? Or will it be a waste of money?

Unfortunately, Lightroom Classic being the fickle beast that it is, I can't really rely on performance benchmarks to gauge whether an upgrade will help me at all.

Also, I know I can adjust my workflow and expectations and import photos before I go to bed so the previews are generated by morning, but software is supposed to work for me and fit how I work, not the other way round.
I appreciate the rephrasing and would ask the same question if I were in your shoes. As I am long overdue for a hardware upgrade, I cannot offer any personal experience of what this type of upgrade would provide. In his posts above, GregJ did mention some improvements and also called out more intensive computations that should greatly benefit from a powerful GPU. But how powerful, and how much more benefit a more powerful GPU brings, is not something I can personally answer. As this thread bumps back up with replies, you may get an answer from somebody who has more direct experience that they are willing to share.

--Ken
 
Almost a year ago I built a new system based on an Intel i7 12700K (12th gen) CPU, 32GB of DDR5 memory, a Z690 motherboard, a couple of 1TB NVMe drives, plus a couple of legacy HDDs for backups. The CPU has an integrated GPU. LrC has always been CPU-oriented, but everything I've read implied that those days are over and GPUs are the future. I was not keen to add a GPU, particularly not a whopping big gamer card (RTX 3060) commensurate with the rest of the system, so I left it out in the hope that I could get by with the relatively small integrated GPU. A discrete GPU could always be added later.

At first I had a number of issues running LrC, and it seemed a discrete GPU was inevitable. However, the 12th gen CPU, Z690 motherboard and Windows 11 were relatively new, and software and driver updates came through regularly. Gradually things improved, and then with LrC v12.1 I finally had a system that was stable and lightning fast; masks work beautifully and quickly. With greater use of AI, there may come a time when a discrete GPU will be needed, but I'm thinking the next upgrade might be to a 13th+ gen i9, and only if that doesn't do it, then a GPU. The takeaway here is that upgrading the rest of the system may be a better option, horses for courses.
 
Bob, I started to say that you were probably the only guy in the world to build a rig like that and not add a GPU. But you built it a year ago, when GPU prices were crazy and double or triple MSRP. Keep an eye on GeForce RTX 3070 prices. I'm waiting to build right now because the 4090 is selling at over 2 grand and MSRP is 1599. But the 30 series prices are dropping fast as the 40 series emerges and really good new AMD cards come out, which adds to the downward price pressure on 30 series pricing.
The 3070 is under 450 bucks now. Keep your eye on it and add it to your rig when the price gets to around 400 bucks. You won't regret it and it will help LR! You will notice it. I guarantee it.
 
Bob, I started to say that you were probably the only guy in the world to build a rig like that and not add a GPU. But you built it a year ago, when GPU prices were crazy and double or triple MSRP. Keep an eye on GeForce RTX 3070 prices. I'm waiting to build right now because the 4090 is selling at over 2 grand and MSRP is 1599. But the 30 series prices are dropping fast as the 40 series emerges and really good new AMD cards come out, which adds to the downward price pressure on 30 series pricing.
The 3070 is under 450 bucks now. Keep your eye on it and add it to your rig when the price gets to around 400 bucks. You won't regret it and it will help LR! You will notice it. I guarantee it.
Will do. I built the system as I did because I wanted longevity. If it lasts as long as the last one I built, I'll probably be beyond caring. I'm not a gamer, so I don't need a monster GPU, but the option is there should future LrC upgrades or other photo-processing software demand it. Yes, I was hoping that with supply easing, crypto crashing and the 4000 series coming out, 3000-series GPU prices would crash, but without the need being there, it hasn't inspired me.
 