
Recommended Graphics Card for LR Classic

Status
Not open for further replies.

iwaddo

Active Member
Joined
Apr 13, 2020
Messages
130
Lightroom Version Number
9.2.1
Operating System
Windows 10
Hi, I know this is not strictly a LR Classic question, but I am struggling to find a list of graphics cards from which I might choose to upgrade; there is just too much choice.

Photoshop is using the onboard graphics but LR Classic isn't; the motherboard is too old.

I do not want to play games, I just want to give both Photoshop and LR Classic a little performance boost.

I do not want to spend money unnecessarily but at the same time I do not have money to waste.

Could anyone please recommend a good value-for-money card that will be supported by Photoshop and LR Classic?

Thank you for your help.
 
I would think that any of the newer video cards on the market with sufficient VRAM that meet the Apple "Metal" standard will do. You should disable the built-in GPU so that all GPU requests are handled by the new card.
When Adobe first introduced GPU acceleration, a lot of older equipment did not even support OpenGL. We've moved on from OpenGL, and the Metal API is now the criterion to meet.
 
Puget's advice:
 

Attachments

  • Schermafdruk 2020-05-11 19.31.05.jpg (83.3 KB)
For Windows machines I have heard from a few sources that there is not much benefit in spending a lot of money on a graphics card for Lr and Ps, as Adobe does not take full advantage of the GPU's processing power. The one piece of advice I recall is that the more GPU memory you can get, the better.
I plan to build a high-spec workstation towards the end of 2020. Unless Adobe announces new GPU-enabled features, GPU improvements, or more info on this subject before then, I plan to spend no more than 150 to 200 USD on a GPU card.

When it is clear that there are advantages to be had in Lr/Ps from purchasing a high-end card, I will be happy to spend the money on a graphics card. For me, deciding on a graphics card is the most difficult part of specifying my new build.

If you are driving a 4k screen or 2 x 4k screen or 5k screen then there are stronger reasons to increase GPU processing capability.
 
I would think that any of the newer video cards on the market with sufficient VRAM that meet the Apple "Metal" standard will do. You should disable the built-in GPU so that all GPU requests are handled by the new card.
When Adobe first introduced GPU acceleration, a lot of older equipment did not even support OpenGL. We've moved on from OpenGL, and the Metal API is now the criterion to meet.
Sorry, a little confused, does this apply to a Windows 10 PC?
 
For Windows machines I have heard from a few sources that there is not much benefit in spending a lot of money on a graphics card for Lr and Ps, as Adobe does not take full advantage of the GPU's processing power. The one piece of advice I recall is that the more GPU memory you can get, the better.
I plan to build a high-spec workstation towards the end of 2020. Unless Adobe announces new GPU-enabled features, GPU improvements, or more info on this subject before then, I plan to spend no more than 150 to 200 USD on a GPU card.

When it is clear that there are advantages to be had in Lr/Ps from purchasing a high-end card, I will be happy to spend the money on a graphics card. For me, deciding on a graphics card is the most difficult part of specifying my new build.

If you are driving a 4k screen or 2 x 4k screen or 5k screen then there are stronger reasons to increase GPU processing capability.
Interesting; we really do not want to spend money for little or no return. This is an older PC to which we have added an SSD for the C drive, and we are probably going to put in two mirrored Seagate 2TB Barracuda 3.5" SATA 6Gb/s disks, so spending loads on a graphics card is just not doable with limited funds.
 
There may be advantages to having a graphics card as opposed to no graphics card; I am not qualified enough to form that view, especially as it relates to Lr and Ps. If you look at graphics cards at the moment, the prices go exotic very quickly, and while these may provide benefits for gamers and the like, the feedback I have is that Lr/Ps does not reward buying high-end graphics cards. I hope that will change, and I would love to have more clarity on this subject, as I do want to build a high-spec workstation and desperately want to know the best card to get. Other factors for me are the heat generated by high-end graphics cards, the impact on cooling of the overall workstation, and the impact on noise. I do not want a super-duper workstation with noisy CPU and graphics fans running all day.
 
For Windows, you need DirectX 12 support. This means using Windows 10 (DirectX 12 is not supported by Windows 7 and earlier) and a graphics card that is recent enough (most current graphics cards support DirectX 12).
I second the point about the amount of memory on the graphics card. This is very important, especially with the big files generated by modern cameras and the increasing size of displays. The larger the memory, the better.
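To give a feel for why VRAM matters here, below is a back-of-envelope sketch of the GPU working set for editing a full-resolution raw file. The channel count, bytes per channel, and number of intermediate buffers are illustrative assumptions, not Adobe's actual pipeline figures:

```python
def vram_estimate_mb(width_px, height_px, channels=4, bytes_per_channel=4, working_buffers=3):
    """Rough GPU working-set estimate for editing one full-resolution image.

    Assumes the image is held as a full-resolution float texture
    (4 channels x 4 bytes per channel) plus a few intermediate buffers.
    These counts are guesses for illustration only.
    """
    bytes_needed = width_px * height_px * channels * bytes_per_channel * working_buffers
    return bytes_needed / (1024 ** 2)

# A 24 MP raw file (6000 x 4000 px) under these assumptions:
print(round(vram_estimate_mb(6000, 4000)))  # ~1099 MB
```

Even under these rough assumptions, a single 24 MP image can occupy on the order of a gigabyte of VRAM before the display buffers for a 4K monitor are counted, which is why the "more GPU memory is better" advice holds.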
 
Sorry, a little confused, does this apply to a Windows 10 PC?
It applies to the graphics card that you want to purchase for your Windows computer. Any graphics card with sufficient VRAM and Metal support will give you the performance level that you are looking for in Lightroom.
 
Someone just sent me a link to this page on the nVidia website. Note that you need the latest-model 2060 card to get GPU support for Enhanced Details.

Your Favorite Creative Apps Accelerated With NVIDIA GPUs. nVidia is rumored to be announcing a new generation in Q3, but with a card that replaces the 2080, which sells for US $650 to over $1,000. Too rich for my blood. Over time (but who knows how much time) there will be a replacement for the 2060.

Phil Burton
 
Until I can see a clear case for buying a high end graphics card I would use a list like this to help me select a graphics card for a custom build.
https://www.videocardbenchmark.net/gpu_value.html
I would go through the list checking the combination of performance, value for money, and actual graphics memory.

I have just purchased a high-end Dell laptop, which has the Nvidia GeForce GTX 1650 card. The list above prices that card at USD 140.
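The shortlist process described above (filter by a VRAM floor, then rank by benchmark points per dollar) can be sketched in a few lines. The card names, scores, and prices below are made up for illustration; they are not taken from the PassMark list:

```python
# Hypothetical entries in the style of a GPU value list:
# (card name, benchmark score, street price in USD, VRAM in GB).
cards = [
    ("Card A", 7000, 150, 4),
    ("Card B", 12000, 350, 6),
    ("Card C", 9000, 200, 4),
]

MIN_VRAM_GB = 4  # screen out cards below a chosen VRAM floor

def value_score(score, price):
    """Benchmark points per dollar -- one simple way to rank 'value for money'."""
    return score / price

# Keep cards meeting the VRAM floor, best value first.
shortlist = sorted(
    (c for c in cards if c[3] >= MIN_VRAM_GB),
    key=lambda c: value_score(c[1], c[2]),
    reverse=True,
)
for name, score, price, vram in shortlist:
    print(f"{name}: {value_score(score, price):.1f} pts/$ ({vram} GB)")
```

With these invented numbers the cheapest card wins on points per dollar even though the expensive card has the highest raw score, which is exactly the trade-off the poster is weighing.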
 
Until I can see a clear case for buying a high end graphics card I would use a list like this to help me select a graphics card for a custom build.
https://www.videocardbenchmark.net/gpu_value.html
I would go through the list checking the combination of performance, value for money, and actual graphics memory.

I have just purchased a high-end Dell laptop, which has the Nvidia GeForce GTX 1650 card. The list above prices that card at USD 140.
At Newegg, the cheapest 1650 card is $160.

For me, personally, Enhanced Details is important because I do a lot of railroad/rail transit photography, and for these photos "bokeh" is not exactly an issue. Detail is; the more, the better. So I have to decide if I'm willing to pay $200 more for a 2060 to get GPU support (and much faster processing times) for Enhanced Details. A personal decision, of course.
 
Sorry, a little confused, does this apply to a Windows 10 PC?
It applies to the graphics card that you want to purchase for your Windows computer. Any graphics card with sufficient VRAM and Metal support will give you the performance level that you are looking for in Lightroom.
Just to be clear, Windows and Mac users have separate and quite different criteria and constraints with regards to graphics cards:
  • For Windows, Nvidia is a strong choice, and Apple Metal support should not be part of the discussion because Metal is an Apple-only API.
  • For macOS, it's the opposite: Apple Metal performance is critical because Adobe Mac applications including Lightroom are building their Mac GPU support around this API, and Nvidia should not be part of the discussion because Nvidia graphics have zero driver support for any current Mac.
Because of that, a GPU discussion with either a Windows or Mac user should not mention both Metal and Nvidia. The discussion will involve one or the other, since neither platform supports both.

Right now, making a long-term decision about a graphics card should not hinge too much on Lightroom Classic, because GPU support in Lightroom is currently limited, but improving over time, and Adobe is probably not done with it. In other words, it’s a moving target. Currently, a GPU helps the most in the Develop module and for displays and raw images with very high pixel counts. And just as important in the GPU buying decision is understanding that in Lightroom Classic outside of those areas (e.g. for exporting), a GPU doesn’t help, no matter how much you spend.

The GPU buying decision should depend less on Lightroom and more on how many large displays you plan to drive, and on the needs of other, GPU-hungrier applications you might use like your video editing application, games, 3D... A GPU that can satisfy those needs is probably going to be fine for Lightroom Classic for the next few years. If Lightroom Classic is your primary application, then for a Windows 10 computer it's probably good to use the Puget Systems GPU guidance for Lightroom Classic as a guide.
 
Just to be clear, Windows and Mac users have separate and quite different criteria and constraints with regards to graphics cards:
  • For Windows, Nvidia is a strong choice, and Apple Metal support should not be part of the discussion because Metal is an Apple-only API.
  • For macOS, it's the opposite: Apple Metal performance is critical because Adobe Mac applications including Lightroom are building their Mac GPU support around this API, and Nvidia should not be part of the discussion because Nvidia graphics have zero driver support for any current Mac.
Because of that, a GPU discussion with either a Windows or Mac user should not mention both Metal and Nvidia. The discussion will involve one or the other, since neither platform supports both.

Right now, making a long-term decision about a graphics card should not hinge too much on Lightroom Classic, because GPU support in Lightroom is currently limited, but improving over time, and Adobe is probably not done with it. In other words, it’s a moving target. Currently, a GPU helps the most in the Develop module and for displays and raw images with very high pixel counts.

The GPU buying decision should depend less on Lightroom and more on how many large displays you plan to drive, and on the needs of other, GPU-hungrier applications you might use like your video editing application, games, 3D... A GPU that can satisfy those needs is probably going to be fine for Lightroom Classic for the next few years. If Lightroom Classic is your primary application, then for a Windows 10 computer it's probably good to use the Puget Systems GPU guidance for Lightroom Classic as a guide.
Thank you, this is really helpful.
 
Just to be clear, Windows and Mac users have separate and quite different criteria and constraints with regards to graphics cards:
I disagree, for this reason: all current Metal API cards are supported in Lightroom. Mac users that have a current graphics adapter do not have GPU issues. Windows users have the most GPU problems. This is due to cards that have not been tested against Lightroom, or that have outmoded or faulty video drivers. By picking a card that supports the Metal API, a Windows user will end up with a card that will in all likelihood be supported and work with Lightroom GPU acceleration.
 
I look forward to this discussion, as this is a subject which requires clarity.

I have just checked the specs of the Nvidia RTX 2080 Super on the Nvidia site (Your Graphics, Now With SUPER Powers) and it makes no reference to Metal support in the specification panel. I cannot see a scenario where Nvidia cards are excluded from selection for a machine built for Lr/Ps.
 
I look forward to this discussion, as this is a subject which requires clarity.

I have just checked the specs of the Nvidia RTX 2080 Super on the Nvidia site (Your Graphics, Now With SUPER Powers) and it makes no reference to Metal support in the specification panel. I cannot see a scenario where Nvidia cards are excluded from selection for a machine built for Lr/Ps.
My reasoning is to address the OP's question. I have no doubt that there are non-Metal cards that work wonderfully in Lightroom Classic; the Nvidia RTX 2080 Super might very well be one of them. It is just that my observation here on this forum is that Windows users have most of the GPU issues. I think this is because there are so many permutations of hardware: a variety of motherboards, some with built-in GPUs and some with add-on cards, different CPUs, etc., so it is difficult to test every permutation. Add to that the background software also running that may interfere with Lightroom Classic's use of the GPU.

I'm just trying to simplify the number of variables.
 
I disagree, for this reason: all current Metal API cards are supported in Lightroom. Mac users that have a current graphics adapter do not have GPU issues. Windows users have the most GPU problems. This is due to cards that have not been tested against Lightroom, or that have outmoded or faulty video drivers. By picking a card that supports the Metal API, a Windows user will end up with a card that will in all likelihood be supported and work with Lightroom GPU acceleration.
Cletus,

Until a few weeks ago, I was planning to upgrade my new Windows 10 build with an AMD RX 5600 XT, since it seemed to be competitive with nVidia cards costing $50 (or more) above this card. Then I read reviews and forum discussions concerning this card, and the short answer is, I need reliable drivers that don't freeze up or crash the system a lot. To be clear, many of the reviews were about specific games, but other reviews lamented that AMD's driver quality is, shall we say, not great. In my heart I really wanted to get an AMD card, but I also need to be realistic about having a stable system.

I don't mean to imply that money passes through my fingers like water, it doesn't. But if I need to get relatively reliable drivers, then I have to Pay The Man the Money. I know that nVidia drivers also have issues, but I have been using a GTX 660 card in my old system since around 2014, and it has been driver-trouble free.
 
Some basic facts: nVidia is at this moment not supported by Apple, and in response (tit for tat?), nVidia does not support Apple-specific options such as "Metal".

Looking at the general graphics card supply for PCs, nVidia is currently at the forefront against AMD, so I would recommend that you look at nVidia cards. At present, AMD cards can, however, give more "bang per buck", so with a lower budget they can be a good choice.

Lightroom's needs for a graphics card are not that high. I currently run an nVidia GeForce GTX 1050 Ti 4GB which, as can be seen in the included screenshot, gives full acceleration...

[Screenshot: Annotation 2020-05-20 081943.png]

In addition, my system uses an i7-9700K @ 4 GHz, the LR catalogue is located on an SSD, and the image store is on a 7200 rpm Optane-accelerated HDD, so the general spec is quite reasonable, which will have an effect on performance.
I would suggest that any nVidia card from the 4GB GTX 1050 Ti upwards would allow full acceleration on your system, and to be honest, ANY modern discrete GPU would give a visible improvement in performance over the current onboard graphics (this applies to any card, including AMD, as it would free the currently shared RAM for image processing by replacing it with faster VRAM on the card, even if full acceleration was not possible at the general hardware level available).
 
Cletus,

Until a few weeks ago, I was planning to upgrade my new Windows 10 build with an AMD RX 5600 XT, since it seemed to be competitive with nVidia cards costing $50 (or more) above this card. Then I read reviews and forum discussions concerning this card, and the short answer is, I need reliable drivers that don't freeze up or crash the system a lot. To be clear, many of the reviews were about specific games, but other reviews lamented that AMD's driver quality is, shall we say, not great. In my heart I really wanted to get an AMD card, but I also need to be realistic about having a stable system.
It is interesting that you come to this conclusion, as (at least for my iMac) Apple used the AMD Radeon Pro 560 4 GB.
I pulled this quote from the internet: "The average gaming performance in Windows of the Radeon Pro 460 is just above the GeForce GTX 960M and below an average GTX 965M. Compared to the slightly raised clocks of the desktop 500-series models, the Radeon Pro 560 should only be slightly faster." YMMV.
 
Some basic facts: nVidia is at this moment not supported by Apple, and in response (tit for tat?), nVidia does not support Apple-specific options such as "Metal".
It would make no sense for nVidia to try to sell their cards into the MacOS market, because Mac users don't build their own systems, unlike many Windows users. And if you are a Mac user who wants improved graphics performance, isn't the obvious choice to get a faster AMD card, for which Apple provides drivers? The situations are not parallel.
Looking at the general graphics card supply for PCs, nVidia is currently at the forefront against AMD, so I would recommend that you look at nVidia cards. At present, AMD cards can, however, give more "bang per buck", so with a lower budget they can be a good choice.
We are in agreement about the "bang per buck" for AMD cards. But please see my comments above about driver quality.
Lightroom's needs for a graphics card are not that high. I currently run an nVidia GeForce GTX 1050 Ti 4GB which, as can be seen in the included screenshot, gives full acceleration...
Please see my comment above regarding Enhanced Details.

Phil
 
It would make no sense for nVidia to try to sell their cards into the MacOS market, because Mac users don't build their own systems, unlike many Windows users. And if you are a Mac user who wants improved graphics performance, isn't the obvious choice to get a faster AMD card, for which Apple provides drivers? The situations are not parallel.
There are reasons why Nvidia GPUs are not available for Macs right now, but they're different reasons than those. There are Mac users who do like to build their own systems; they are the ones who buy the Mac Pro tower models with multiple PCI card slots and drive bays, or people like me who dropped a graphics card into an eGPU enclosure to augment my Mac laptop’s integrated graphics. You can still put Nvidia GPUs into the older Mac Pros (2006–2012); the models lacking Nvidia driver support are the ones requiring later versions of macOS.

And for MacBook Pro laptops, Nvidia was the supplier of their discrete GPUs for many years. But it is exactly that relationship that is said to be why Nvidia is not currently supported on Macs. The reasons are not technical or related to the users, but strictly political. It is said that the high failure rate of Nvidia mobile GPUs (which affected at least one of my MacBook Pros) soured Apple on Nvidia because of what it cost Apple to run a service program to repair/replace those at no cost to the customer…a fix that did not always stick. And when that Nvidia GPU died for good, you could not start the computer, you had to write off that motherboard. Also Nvidia has worked hard to try and get their proprietary CUDA architecture to be a de facto standard, and this is at odds with Apple wanting to build developer support for the Apple Metal graphics API.

There are quite a few Mac users who would love to have Nvidia support, but until Apple and Nvidia can come to some kind of agreement, it is not going to happen.
 
There are reasons why Nvidia GPUs are not available for Macs right now, but they're different reasons than those. There are Mac users who do like to build their own systems; they are the ones who buy the Mac Pro tower models with multiple PCI card slots and drive bays, or people like me who dropped a graphics card into an eGPU enclosure to augment my Mac laptop’s integrated graphics. You can still put Nvidia GPUs into the older Mac Pros (2006–2012); the models lacking Nvidia driver support are the ones requiring later versions of macOS.

And for MacBook Pro laptops, Nvidia was the supplier of their discrete GPUs for many years. But it is exactly that relationship that is said to be why Nvidia is not currently supported on Macs. The reasons are not technical or related to the users, but strictly political. It is said that the high failure rate of Nvidia mobile GPUs (which affected at least one of my MacBook Pros) soured Apple on Nvidia because of what it cost Apple to run a service program to repair/replace those at no cost to the customer…a fix that did not always stick. And when that Nvidia GPU died for good, you could not start the computer, you had to write off that motherboard. Also Nvidia has worked hard to try and get their proprietary CUDA architecture to be a de facto standard, and this is at odds with Apple wanting to build developer support for the Apple Metal graphics API.

There are quite a few Mac users who would love to have Nvidia support, but until Apple and Nvidia can come to some kind of agreement, it is not going to happen.
Conrad,

Since I haven't owned a Mac since 1995, I am completely unaware of the bad history between Apple and nVidia. Given that history, it's not surprising that Apple won't use nVidia boards/chips any longer. I wouldn't call that decision "political." I would call that a "good business decision," because Apple suffered the reputational and financial damage caused by one of their suppliers.

However, I will still surmise that the number of Mac users who own tower models or use eGPUs is probably too small to justify nVidia's investment in Mac drivers. Or it is entirely possible that modern versions of MacOS actively block nVidia cards from operating.

Considering that high-end nVidia cards easily outperform the current top of the line AMD cards, it is quite understandable that such people would want to have support for nVidia hardware.
 