
My Recent GPU Experience (Gigabyte Nvidia RTX 4070 Ti 12GB)


Gnits

Matt O’Brien
Premium Classic Member
Joined
Mar 21, 2015
Messages
3,084
Location
Dublin, Ireland.
Lightroom Experience
Power User
Lightroom Version
Classic
Lightroom Version Number
Classic 12.2.1
Operating System
Windows 10
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

"This is a relatively long discussion. It is worth reading the discussion in sequence, as posted."

To assist people returning to this discussion, the following are direct links to frequently used posts.

Post #20 contains a link to a raw file to use, if you wish to compare the performance of your rig to others.
https://www.lightroomqueen.com/comm...e-nvidia-rtx-4070-ti-12gb.47572/#post-1315509

Post #30 contains a summary of Denoise performance stats for a range of GPUs and CPUs from readers of this thread.
https://www.lightroomqueen.com/comm...ia-rtx-4070-ti-12gb.47572/page-2#post-1315545

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

I am posting this on the Lightroom Classic forum, as I think it is most relevant to Classic users.

With GPU prices softening and the release of the 4xxx range, I decided to purchase the Gigabyte Nvidia RTX 4070 Ti 12GB. I was aware before purchase that I needed a minimum 750 W PSU. I already had a recently purchased Corsair RM850X and plenty of spare Corsair-supplied compatible internal power cables.
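
For anyone sizing a PSU for a similar card, a back-of-the-envelope power budget shows where a 750 W minimum comes from. A minimal sketch in Python, assuming NVIDIA's published ~285 W TGP for the RTX 4070 Ti; the CPU and "everything else" figures are illustrative placeholders, not measurements from my rig:

```python
# Rough PSU sizing sketch. 285 W is NVIDIA's published TGP for the
# RTX 4070 Ti; the other figures are illustrative placeholders.
components_watts = {
    "RTX 4070 Ti (TGP)": 285,
    "CPU under load (typical desktop)": 150,
    "Motherboard, RAM, fans, drives": 75,
}

total = sum(components_watts.values())

# Modern GPUs draw brief transient spikes above TGP, so leave headroom;
# a common rule of thumb is to load the PSU to only ~60-70%.
recommended = total / 0.65

print(f"Steady-state estimate: {total} W")            # ~510 W
print(f"Suggested PSU rating:  {recommended:.0f} W")  # ~785 W
```

Which lands neatly between the 750 W minimum quoted for the card and the 850 W RM850X I already had.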

The GPU was tricky to install because of its size and the need to install an awkward bracket support system. The GPU was supplied with a Y cable, where the single end was plugged into the GPU and the two other ends plugged into separate power cables (not supplied by Gigabyte) which were then plugged into the PSU. I used Corsair cables, supplied with my PSU.

I triple-checked that I had the polarity of all the cable connections correct: square pins fitted into square holes, rounded pins fitted into rounded holes, etc. When I powered up my PC… nothing happened. Nothing. No lights, no fans starting up, no status numbers on the internal motherboard display. Totally dead… and no clue as to what the problem might be. I feared the worst.

I completely disassembled my PC and removed the Corsair power supply. I then followed the Corsair instructions on how to test the PSU. All tests passed and all (28, I think) pin-outs had exactly the correct voltages.

I rebuilt my rig, this time reinstalling my old GPU. With much relief, it booted perfectly. I had been worried that many of my main components, such as the motherboard, processor and PSU, had been toasted.

I sent a query to Gigabyte requesting advice and troubleshooting steps, detailing the steps I had taken to test my PSU and my overall rig.

At the same time, I started researching known issues with this range of GPUs. I was horrified to discover many incidents where GPUs, cables and other components had melted following installation of 4xxx GPUs. My heart sank. Many of the horror stories pointed to the supplied Y cable as the source. GPU makers pointed towards incorrectly seated motherboard, GPU and/or cable connections. I can confirm that I had triple-checked my connections and listened in each case for the 'click' as firm connections were made.

I ordered a Corsair 600W PCIe 5.0 12VHPWR Type-4 PSU power cable directly from Corsair. It was shipped from France and would take a week.

Six days passed and two events occurred on the same day: 1) my Corsair 12VHPWR cable arrived, and 2) I received a response from Gigabyte. Essentially, Gigabyte told me to check my GPU (which I had already told Gigabyte I had done) or contact the supplier of the GPU (Amazon).

So, I installed the Corsair 12VHPWR cable I had just purchased, connecting the GPU to the power supply. There was no need to triple-check all the connections, as the connectors would only connect the correct way. I gingerly turned on power to the PSU and pressed the start button on the PC… the fan on the PSU slowly started to turn… lights appeared within the GPU, numbers appeared on the motherboard LED display and the boot sequence started.

My PC…came to life… with the sound of the 'Hallelujah Chorus' reverberating in the background.



My Summary.

1. The response from Gigabyte was totally unacceptable.
2. In my opinion, the Y cable supplied by Gigabyte was inappropriate for the intended purpose.
3. Gigabyte's response was seriously underwhelming and totally ignored the elephant in the room, i.e. the Y power cable.

My advice to anyone installing a modern, fairly high-powered GPU is to triple-check the connections needed to support the GPU and to procure a single, fit-for-purpose cable assembly that connects the GPU directly to the PSU, without any third-party connectors. Make sure you have a modern PSU of sufficient wattage, with enough spare slots to cater for twin connections if your GPU requires them.

PS: I have noticed a very pleasing improvement in Lr responsiveness… very timely, as the third event which occurred on the same day was the release of a new Lr version, with the new GPU-focused noise reduction and masking features. Serendipity? 'Sweet'.

I am posting this story in the hope that others will not have to experience the pain and anguish I did, and to help avoid a potentially catastrophic meltdown of lots of very expensive components. I know every config is different… but the bottom line is that modern GPUs are power hungry and need cables fit for purpose.

Here is the cable I purchased directly from Corsair. In my opinion, this should be supplied with GPUs which need dual PSU power slots.

[Image: CorsairPsuGPUCable.jpg]
 

Come on guys, anyone else have some interesting GPUs?
Okay. With my off-the-shelf Acer Aspire TC-885 and its integrated Intel UHD Graphics 630: eleven minutes in, the progress bar was at 1/3 and the monitor screen was fluttering about once every 4 seconds. I shut down the process before the process shut down me! My plan is to build a new PC.
 
Just upgraded to a new-build Win 11 desktop with RTX 3060 Ti 8GB GPU, i5-13600 CPU, 32 GB RAM and an M.2 SSD. Denoised in under 30 seconds, probably nearer 20, as I was a bit slow getting the timer set!
 
PC
CPU : AMD Ryzen 9 5900X
Memory : 32GB
GPU : GTX 1070Ti 8GB
O/S Win 10

Denoise : Est 60 secs Actual 98 secs

Laptop
CPU : i7-11800H 8-Core
Memory : 32GB
GPU : RTX 3060 6GB
O/S Win 10

Denoise : Est 20 secs Actual 28 secs

Canon R5 files
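
A side observation: Lightroom's estimate ran optimistic on both machines, and the raw numbers are easier to compare as ratios. A quick, purely illustrative Python sketch using the figures above:

```python
# Denoise timings reported above as (estimated_s, actual_s),
# run on the poster's own Canon R5 files.
runs = {
    "Ryzen 9 5900X + GTX 1070 Ti 8GB": (60, 98),
    "i7-11800H + RTX 3060 6GB":        (20, 28),
}

for rig, (est, act) in runs.items():
    print(f"{rig}: actual = {act / est:.2f}x the estimate")

# Relative speed of the two rigs on the same workload:
print(f"RTX 3060 laptop vs GTX 1070 Ti desktop: {98 / 28:.1f}x faster")
```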

I'm looking at upgrading my GPU in my PC to a RTX 4070 Ti and found this post and forum while researching.
I recently upgraded from a 1700x to the 5900x on my MSI X370 Gaming Pro Carbon.
 
PC Custom build, ca. 3 years old
CPU : Intel Core i7-9700K @ 3.60 GHz
Memory : 32GB
GPU : NVIDIA GeForce GTX 1660 Ti 6GB; Studio driver 31.0.15.3141
O/S Win 10

File from #20, Setting 100%
Estimated: 60 s
Actual 79 s


Quite OK for my "old" system.
With my 20 MB photos it takes approx. 20 s.

Greetings from Vienna

Franz
 
I'm looking at upgrading my GPU in my PC to a RTX 4070 Ti and found this post and forum while researching.
If you do upgrade to the RTX 4070 Ti, it would be super if you could post the performance figures using the sample image referenced in post #20.

It would give a good indication of performance related not only to the GPU but also a good CPU reference. At some stage I might upgrade to the Ryzen 9 5950X or 5900X, as that will probably be my last option before upgrading the motherboard to AM5. I am not under pressure to upgrade now… but very keen to find out how the relative CPUs impact performance.
 
If you do upgrade to the RTX 4070 Ti, it would be super if you could post the performance figures using the sample image referenced in post #20.

It would give a good indication of performance related not only to the GPU but also a good CPU reference. At some stage I might upgrade to the Ryzen 9 5950X or 5900X, as that will probably be my last option before upgrading the motherboard to AM5. I am not under pressure to upgrade now… but very keen to find out how the relative CPUs impact performance.
It would help if I used that photo haha

PC
CPU : AMD Ryzen 9 5900X
Memory : 32GB
GPU : GTX 1070Ti 8GB
O/S Win 10

Denoise : Est 60 secs Actual 108 secs

Laptop
CPU : i7-11800H 8-Core
Memory : 32GB
GPU : RTX 3060 6GB
O/S Win 10

Denoise : Est 30 secs Actual 37 secs

I thought about doing a new build but everything was expensive.
 
@Linwood Ferguson, maybe start a new thread with your collected performance data, to make it easier to find and call out?
I could, though I was also reluctant to split it up.

I do think we've learned a lot (like buy RTX not GTX).
 
I could, though I was also reluctant to split it up.

I do think we've learned a lot (like buy RTX not GTX).
Did I miss something? Was there a reference photo that everyone could use to generate their results?
 
Adobe and DxO have found bugs in Apple's Neural Engine, so they are unable to use it. DxO PhotoLab could use it up through Monterey, but they found a bug in Ventura over 6 months ago, so they warn people to use the much slower GPU instead. Adobe has also found a bug (unknown whether it is the same one DxO found or another one), so they have disabled the code in Lightroom that uses the Neural Engine for all versions of macOS. Both Adobe and DxO say they have been waiting for Apple to fix the problems. There have been 6 Ventura updates since DxO found the problem, but none of those updates included a fix from Apple.

Adobe info concerning the Neural Engine problem(s):

Eric Chan, one of the longtime senior Adobe software engineers, wrote this article about Denoise AI:

Denoise demystified

https://blog.adobe.com/en/publish/2023/04/18/denoise-demystified

Finally, we built our machine learning models to take full advantage of the latest platform technologies, including NVIDIA’s TensorCores and the Apple Neural Engine. Using these technologies enables our models to run faster on modern hardware.

But then Ian Lyons wrote:

https://community.adobe.com/t5/lightroom-classic-discussions/denoise-ai-in-12-3/m-p/13739400

Currently, the Apple Neural Engine found in M1/M2 silicon Macs is not used by Denoise. It's expected that performance will be even better when Denoise does use the Apple Neural Engine.

There is no change to above comments regarding the use of Tensor cores in NVidia cards. That is, Tensor cores are used by Denoise.


Apparently, an issue on Apple's side means that the Neural Engine is not used by Adobe Denoise. When the issue is addressed, Adobe Denoise is ready to take advantage of the Neural Engine.

And then Rikk Flohr confirmed it:

Ian Lyons information is correct regarding the Neural Engine on Apple devices

Rikk Flohr - Customer Advocacy: Adobe Photography Products

Here is what DxO tells their users (see red notice at bottom):
[Image: Screenshot 2023-05-02 at 14.30.00.jpg]


PC/Nvidia/Windows users do get the full performance from Lightroom/Photoshop and DxO PhotoLab. Unfortunately, we Mac users get much lower performance because the Neural Engine cannot be used any more. Apple Silicon DxO users report that using the Neural Engine is at least 4 times faster than using the GPU. DxO DeepPRIME XD and Lightroom Denoise AI need lots of computing power, so not being able to use the very fast Neural Engine is a big disadvantage for Mac users.
 
My elapsed time with the supplied image was estimated 1 minute, actual 1:27. PC has 32 GB RAM, NVMe 2.0 drives, a 5700G CPU and a GTX 1660 Super GPU. Monitor is 2560 pixels wide. I also ran it through DxO PhotoLab 6 at default DeepPRIME XD, which took 1:12 including the return to Lr. The DxO result needs more investigation, though, because it looked awful at 100% magnification, not at all what I would have expected. Thank you for this thread. I have been wondering what makes Enhance NR so slow, and this discussion has provided the best info I have found thus far. I think I'll be upgrading my GPU sometime soon.
 
Please disregard the above post. Unfortunately, one gets only 10 minutes to write, edit and/or delete posts. That is nowhere near long enough to write and check posts with much detail. I'll have to remember to write complicated posts offline first.
Thank you. Your post implies that there is a lot which is complicated when it comes to this subject. However your explanation defining the differences was clear. Now I understand something more about the topic.
 
Thank you. Your post implies that there is a lot which is complicated when it comes to this subject. However your explanation defining the differences was clear. Now I understand something more about the topic.
Thanks! This is what I had intended to write above.

I'm no computer engineer, but I'll have a go in simplistic terms. As I see it, it's about technology first, then grunt power. The "RT" in RTX we can assume refers to Ray Tracing cores, a sophisticated rendering technology to improve gaming imagery. The GTX doesn't have any RT cores. However, I believe the more important factor for us is Tensor cores, which relate to AI processing. All the RTX cards I've looked at have Tensor cores; the GTX cards don't. The following compares the technologies of two popular cards:
                            GTX 1650   RTX 2060
Ray Tracing cores                  0         30
Tensor cores                       0        240
Shading cores                    896       1920
Texture Mapping units             56        120
Raster Operations Pipeline        31         48
Don’t ask me what these things are but clearly the RTXs are a cut above in technology.
Incidentally, the Intel Arc A770 has 512 tensor cores but unfortunately, there is still an unresolved bug with running AI Denoise on it.
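
If you'd like to see what your own NVIDIA card reports, here's a minimal sketch using PyTorch (an assumption on my part: you have a CUDA-enabled `torch` install; this is not anything Adobe itself uses). One caveat: the GTX 16xx cards report the same compute capability (7.5) as the RTX 20xx series yet ship without Tensor cores, so the marketing name is the more reliable hint here:

```python
# Quick GPU capability check via PyTorch (assumes a CUDA build of torch).
# Caveat: GTX 16-series reports compute capability 7.5 (Turing) but has
# no Tensor cores, so we also look at the marketing name.
import torch

if torch.cuda.is_available():
    name = torch.cuda.get_device_name(0)
    major, minor = torch.cuda.get_device_capability(0)
    print(f"GPU: {name}, compute capability {major}.{minor}")
    if "RTX" in name:
        print("RTX card: Tensor cores present.")
    elif major >= 7:
        print("Turing-or-newer, but GTX 16xx parts lack Tensor cores.")
    else:
        print("Pre-Turing GPU: no Tensor cores.")
else:
    print("No CUDA GPU visible to PyTorch.")
```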
 
Data to date, updated 4/28/2023. I'll try to update this note as new info comes in. Corrections solicited, will leave in the order received for now so you can match up.
Please annotate my Macbook Pro entry to indicate it's an M2 Max chip (the different Apple Silicon chips have significantly different clock speeds and graphics processors).
 
Windows 10
Intel i9-9900KF
32gb RAM
Nvidia GeForce GTX 1650 OC (4 GB)
Sample provided = 60 MB
Estimated time = 4 minutes
Actual time = 7 minutes 20 seconds

Performance monitor showed GPU utilization never more than about 50%, about 8 GB memory and 15% max CPU utilization.
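
For anyone who wants to watch utilization the same way while Denoise runs, here's a minimal polling sketch using NVIDIA's NVML bindings (assumes the `nvidia-ml-py` package and an NVIDIA GPU; running `nvidia-smi -l 1` in a terminal gives much the same view):

```python
# Poll GPU utilization and VRAM once a second while Denoise runs.
# Requires: pip install nvidia-ml-py  (NVIDIA GPUs only)
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)  # percent
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)         # bytes
        print(f"GPU {util.gpu:3d}%  VRAM {mem.used / 2**30:.1f} GiB")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```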

For comparison, with a 32 MB photo of my own:
Estimated time = 4 minutes
Actual time approx 4 minutes

The only questions now are:
1. Do I have enough high-ISO pics that need processing right now to make it worth spending around AU$550?
2. When I do upgrade the GPU, should I go for 8 GB or 12 GB? (It will have to be the RTX 3060 if I don't want to change anything else, such as the power supply. I am halfway through the life of my computer and deliberately chose a cheaper GPU, so I did expect to upgrade sometime.)

Many thanks to Gnits and other contributors for this topic!
 
My instinct is to go for 12 GB now if you have the opportunity.

There are daily announcements of new RTX 40xx models hitting the market. I would wait a little while until the 40xx models are generally in stock and then check the relative prices of the 30xx and 40xx models.

For me… I upgraded to improve the overall performance of Lr; the arrival of the Denoise feature was a happy accident and a feature I rarely use, but I am very happy with the boost in overall Lr interactivity.
 
For me… I upgraded to improve the overall performance of Lr; the arrival of the Denoise feature was a happy accident and a feature I rarely use, but I am very happy with the boost in overall Lr interactivity.
Just wait until the announcement of LrC 13 in October.
 
Windows 10
Intel i9-9900KF
32gb RAM
Nvidia GeForce GTX 1650 OC (4 GB)
Sample provided = 60 MB
Estimated time = 4 minutes
Actual time = 7 minutes 20 seconds

Performance monitor showed GPU utilization never more than about 50%, about 8 GB memory and 15% max CPU utilization.

For comparison, with a 32 MB photo of my own:
Estimated time = 4 minutes
Actual time approx 4 minutes

The only questions now are:
1. Do I have enough high-ISO pics that need processing right now to make it worth spending around AU$550?
2. When I do upgrade the GPU, should I go for 8 GB or 12 GB? (It will have to be the RTX 3060 if I don't want to change anything else, such as the power supply. I am halfway through the life of my computer and deliberately chose a cheaper GPU, so I did expect to upgrade sometime.)

Many thanks to Gnits and other contributors for this topic!
1. That's a pragmatic point of view, like, "Can Denoise make my high-ISO images saleable?". As an enthusiast, my view is quite different. Poor images I discard regardless. However, for an image that particularly pleases me, I can't help thinking, "How much better would it be if I Denoised it first?" :)
2. Any RTX GPU will be a great improvement over what you have.
 