My Recent GPU Experience (Gigabyte Nvidia RTX 4070 Ti 12GB)


Gnits (Matt O'Brien)
Premium Classic Member
Joined: Mar 21, 2015
Messages: 3,673
Location: Dublin, Ireland
Lightroom Experience: Power User
Lightroom Version: Classic 12.2.1
Operating System: Windows 10
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

"This is a relatively long discussion. It is worth reading the discussion in sequence, as posted."

To assist people returning to this discussion, the following are direct links to frequently used posts.

Post #20 contains a link to a raw file to use, if you wish to compare the performance of your rig to others.
https://www.lightroomqueen.com/comm...e-nvidia-rtx-4070-ti-12gb.47572/#post-1315509

Post #30 contains a summary of Denoise performance stats for a range of GPUs and CPUs from readers of this post.
https://www.lightroomqueen.com/comm...ia-rtx-4070-ti-12gb.47572/page-2#post-1315545

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

I am posting this on the Lightroom Classic forum, as I think it is most relevant to Classic users.

With GPU prices softening and the release of the 4xxx range, I decided to purchase the Gigabyte Nvidia RTX 4070 Ti 12GB. I was aware before purchase that I needed a minimum 750 Watt PSU. I already had a recently purchased Corsair RM850X and plenty of spare Corsair-supplied compatible internal power cables.

The GPU was tricky to install because of its size and the need to fit an awkward bracket support system. The GPU was supplied with a Y cable: the single end plugged into the GPU, and the two other ends plugged into separate power cables (not supplied by Gigabyte), which were then plugged into the PSU. I used the Corsair cables supplied with my PSU.

I triple-checked that the polarity of all the cable connections was correct: square pins fitted into square holes, rounded pins into rounded holes, and so on. When I powered up my PC, nothing happened. Nothing. No lights, no fans starting up, no status numbers on the internal motherboard display. Totally dead, and no clue as to what the problem might be. I feared the worst.

I completely disassembled my PC and removed the Corsair power supply. I then followed the Corsair instructions on how to test the PSU. All tests passed, and all (28, I think) pin-outs showed exactly the correct voltages.

I rebuilt my rig, this time reinstalling my old GPU. With much relief, it booted perfectly. I had been worried that many of my main components, such as the motherboard, processor and PSU, had been toasted.

I sent a query to Gigabyte requesting advice and troubleshooting steps, detailing the steps I had taken to test my PSU and my overall rig.

At the same time, I started researching known issues with this range of GPUs. I was horrified to discover many incidents where GPUs, cables and other components had melted following installation of 4xxx GPUs. My heart sank. Many of the horror stories pointed to the supplied Y cable as the source. GPU makers pointed towards incorrectly seated motherboards, GPUs and/or cables. I can confirm that I had triple-checked my connections and listened in each case for the 'click' as firm connections were made.

I ordered a Corsair 600W PCIe 5.0 12VHPWR Type-4 PSU power cable directly from Corsair. It was shipped from France and would take a week.

Six days passed, and two events occurred on the same day: 1) my Corsair 12VHPWR cable arrived, and 2) I received a response from Gigabyte. Essentially, Gigabyte told me to check my GPU (which I had already told Gigabyte I had done) or contact the supplier of the GPU (Amazon).

So I installed the Corsair 12VHPWR cable I had just purchased, connecting the GPU to the power supply. There was no need to triple-check the connections, as the connectors would only fit together the correct way. I gingerly turned on power to the PSU and pressed the start button on the PC. The fan on the PSU slowly started to turn, lights appeared within the GPU, numbers appeared on the motherboard LED display and the boot sequence started.

My PC came to life, with the sound of the 'Hallelujah Chorus' reverberating in the background.



My Summary.

1. The response from Gigabyte was totally unacceptable.
2. In my opinion, the Y cable supplied by Gigabyte was inappropriate for the intended purpose.
3. Gigabyte's response was seriously underwhelming and totally ignored the elephant in the room, i.e. the Y power cable.

My advice to anyone installing a modern, fairly high-powered GPU is to triple-check the connections needed to support the GPU and to procure a single, fit-for-purpose cable assembly which connects the GPU directly to the PSU, without any third-party connectors. Make sure you have a modern PSU of sufficient wattage, and make sure you have sufficient spare slots on your PSU to cater for twin connections if they are required to power your GPU.

PS. I have noticed a very pleasing improvement in Lr responsiveness. Very timely, as the third event which occurred on the same day was the release of a new Lr version, with the new GPU-focused noise reduction and masking features. Serendipity? 'Sweet'.

I am posting this story in the hope that others will not have to experience the pain and anguish I have, and to help avoid a potentially catastrophic meltdown of lots of very expensive components. I know every config is different, but the bottom line is that modern GPUs are power hungry and need cables fit for purpose.

Here is the cable I purchased directly from Corsair. In my opinion, this should be supplied with GPUs which need dual PSU power slots.

CorsairPsuGPUCable.jpg
 
The i9-12900K has the same iGPU as my i7-12700K. I have no discrete GPU and 32GB of RAM. Denoise runs entirely on the iGPU and takes 4min 10sec on a 20MB file. I'd be interested to know how long your machine would take to run Denoise on just the iGPU. If you're inclined to try it, you should be able to select the GPU easily in Edit > Preferences > Performance. I totally understand if you'd rather not.
Hi Bob,
It's not currently possible to test CPU-only without deconstructing the machine, unfortunately; AI Denoise will use the add-on GPU regardless of the GPU options selected in Preferences, and my contacts on the dev team confirm this behavior is expected. I would be keen to know how the CPU compares too, as my CPU putters along barely above idle while the AI NR is flogging the GPU at 99% load.
Cheers, Rob
 
Understood! Yes, much the same with me. Running Denoise, the CPU does barely anything while the iGPU runs at near 100%.
The reason I asked is that I wondered whether upping the CPU and RAM to what you have would give a big enough performance boost to avoid having to buy an RTX of some sort, while also getting the general benefits of a faster CPU. I'm pretty sure the answer is no.
 
I would assume not; the new AI code seems optimized to use the Tensor cores in add-on GPUs, which makes sense. I hope Puget Systems updates their Adobe GPU tests soon to reflect GPU performance on the new AI features.
 
It's not currently possible to test CPU-only without deconstructing the machine, unfortunately; AI Denoise will use the add-on GPU regardless of the GPU options selected in Preferences, and my contacts on the dev team confirm this behavior is expected. I would be keen to know how the CPU compares too, as my CPU putters along barely above idle while the AI NR is flogging the GPU at 99% load.

On Windows and Intel Macs, you can easily force LR to use the CPU for all AI commands (including Denoise) by restarting in Windows or Mac safe mode, which causes the operating system to disable the GPU. (This doesn't work on Apple Silicon Macs.)

A couple of tests Linwood Ferguson and I have done show a 50x Denoise slowdown going from a newer GPU to the CPU. Other tests I've done show only a 5x slowdown for AI masking.

Adobe has stubbornly and inexplicably resisted adding an option that would let you disable the use of the GPU for AI commands, even though all the other ways that LR uses the GPU have such options:
https://community.adobe.com/t5/ligh...disable-the-gpu-for-ai-masking/idi-p/12480845

This would be trivial to implement, since the AI libraries LR uses to run its AI models (Apple's CoreML and the open-source ONNX Runtime on Windows) allow models to be run on either the CPU or the GPU. It would let those on older computers use AI masking (though Denoise would be intolerably slow), and it would make troubleshooting so much easier.
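
To make the point concrete, here is a minimal sketch of what provider selection looks like in ONNX Runtime's Python API. The model path is purely hypothetical and this is not Adobe's actual code; it just illustrates that the same model can be pointed at the GPU or the CPU by changing one argument (the DirectML provider requires the onnxruntime-directml package on Windows):

import onnxruntime as ort

# Show which execution providers this ONNX Runtime build offers,
# e.g. ['DmlExecutionProvider', 'CPUExecutionProvider'] on Windows.
print(ort.get_available_providers())

# Hypothetical model file, purely for illustration.
model_path = "denoise_model.onnx"

# The same model can run on the GPU (DirectML on Windows) or the CPU
# simply by changing the provider list passed at session creation.
gpu_session = ort.InferenceSession(
    model_path, providers=["DmlExecutionProvider", "CPUExecutionProvider"])
cpu_session = ort.InferenceSession(
    model_path, providers=["CPUExecutionProvider"])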
 
That's an option I hadn't considered; clumsy, but it would provide some indication (sans any driver optimization/acceleration). I might give it a go, but it would still take some effort, as I would need to set up an alternate profile with an LR install so as not to trash my working desktop: the generic plug-and-play video driver supports a maximum resolution of 1024x768.
 
I needed to pay back for the useful info on this thread, so I finally downloaded the sample ARW file and ran LRC Denoise AI with Amount = 50.

CPU = Intel Core i5-4570 @ 3.2GHz
RAM = 16GB
GPU = NVIDIA GeForce GT1030
VRAM = 2GB
OS = Win10
Denoise est = 45min
Denoise actual = 35min

OW! Which is why I'm looking at a new PC. Currently thinking of either a Mac Mini M2 Pro or a custom PC.
I'd be very grateful to see some timings from a Mac Mini M2 Pro. If someone does report on that, would they please distinguish which version of the M2 Pro they used, i.e. the 10-core CPU/16-core GPU or the 12-core CPU/19-core GPU. I'd also be interested to see whether there are any differences which might be attributed to VRAM size on the graphics card in the case of Windows PCs. I'm particularly interested in the RTX 3060 variants: there's an 8GB and a 12GB VRAM version, as well as the Ti version. I'd like to think the 12GB version might provide some performance bonus, as per:

https://www.techspot.com/review/2581-nvidia-rtx-3060-8gb/
https://techuda.com/rtx-3060-12gb-vs-8gb-which-one-to-buy/

I'm also keeping my eye on the Intel A770 card, which appears to be good value, if only there wasn't a bug affecting LRC Denoise.
 
Me too!
 
I said "… and M3 stats at some stage."
Should be interesting when they do arrive.
 
I know there's already a data point for this, but I got a chance to try the sample image on a Mac mini M1 8GB box:

CPU = M1 8 CPU (4 performance, 4 efficiency)/8 GPU
RAM = 8GB
GPU = embedded
VRAM = N/A
OS = Mac OS Ventura
Denoise est = 3min
Denoise actual = 2min 10sec
 
Here's a CPU/GPU combo I don't think has been reported on:

CPU = Intel 13600K
RAM = 32 GB DDR4
GPU = RX 570
VRAM = 8 GB GDDR5
OS = Windows 11
Denoise est. = 2 min
Denoise Actual = 151 sec.

I noticed from Task Manager that my CPU never went above 1% usage and while the GPU ran near 100% the whole time, it never used more than 3.8 GB of its available 8 GB.
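
If anyone wants to cross-check what Task Manager shows, here's a rough Python sketch (assuming an Nvidia card with nvidia-smi on the PATH; the query flags are standard nvidia-smi options) that polls GPU load and VRAM once a second while Denoise runs:

import subprocess
import time

# Poll GPU utilisation and VRAM once per second for 60 seconds.
for _ in range(60):
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,memory.used,memory.total",
         "--format=csv,noheader"],
        capture_output=True, text=True)
    print(result.stdout.strip())
    time.sleep(1)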
 
Here's my second PC:

CPU = Intel 12100F
RAM = 16GB
GPU = RTX 2060
VRAM = 6 GB
OS = Windows 11
Denoise est. = 40 sec.
Denoise Actual = 48 sec.

I built this entire PC last November for less than I was planning to spend on a new GPU for my "main" PC to get AI Denoise to run in 30 sec or faster. The other PC was also built in Nov. of last year, but with the RX 570 and a few other components from a 2019 build. I paid $180 new for the RTX 2060 from Amazon on Nov. 28th; they're now ~$385! Maybe I should have bought two of them.
 
An update to my abysmal result with the GTX 1050 Ti.

It was a toss-up between upgrading to an RTX 3060 or the Ti version, but in the end I decided on the latter, with its 35% extra Tensor cores, faster memory, etc. I got a very good deal on a new MSI RTX 3060 Ti OC (factory overclock) with the faster GDDR6X memory, and it arrived today.

CPU: Ryzen 5900x
RAM: 32 GB DDR4 3600
GPU: RTX 3060 Ti OC 8GB GDDR6X
OS: Windows 11 Pro

The results with Linwood's test image:
Predicted: 20 seconds
Actual: 23 seconds

On my Sony A7iv 33mp files:
Predicted: 10 seconds
Actual: 13 seconds

Needless to say, I am thrilled with the result, as my own images with the 1050 Ti were taking 4 minutes!
 
I have replaced my GPU. Previously, I had an Nvidia GTX 1070, now I have an RTX 2060 Super. Big difference! An RTX series GPU is a huge improvement over a GTX series.


Here is the current information:

Windows 11
Intel i5-9600
32GB RAM
Nvidia GeForce RTX 2060 Super

Estimated time= 30 seconds
Actual time = 35 seconds


Previously, I posted the following results:

Windows 11
Intel i5-9600
32GB RAM
Nvidia GeForce GTX 1070

Estimated time= 2 minutes
Actual time = 1 minute 49 seconds
 
Here are my results using the test photo:
Windows 11
Intel i5-12400F
16 GB RAM
Nvidia GeForce RTX 3060 12 GB
Estimated time 30 s
Actual time 41 s
 
Yep, those tensor cores will do it. 41 seconds is a good result but I think others with the same GPU are getting better results. Maybe more RAM would help as well.
 
Computer - i7-8700 with 32GB RAM
Nvidia GeForce 3060 Ti
Estimated time for Denoise was 20 seconds; actual time was 24 seconds.

This is the same computer that I used with an Nvidia GeForce 1060, which took 159 seconds to denoise the sample image.

For my 26 MP Canon RP images, the estimated time to denoise is 9 seconds and the actual time is 10 seconds.

For both images (the test image and my images) I am looking at a 6.6-fold improvement in processing times.
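
For anyone checking the arithmetic, the 6.6-fold figure is just the ratio of the two times quoted above for the sample image; a trivial Python sketch:

# Ratio of the GTX 1060 time to the RTX 3060 Ti time on the sample image.
old_seconds = 159  # GTX 1060
new_seconds = 24   # RTX 3060 Ti
print(f"{old_seconds / new_seconds:.1f}x faster")  # prints "6.6x faster"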
 
Here are my results using the test photo:
Windows 11
Intel i5-12400F
16 GB RAM
Nvidia GeForce RTX 3060 12 GB
Estimated time 30 s
Actual time 41 s
I agree with Bob that you should have faster times. I have a slower processor and a slower GPU than you, and my actual time is faster than your result. But I personally would not spend the money to double my RAM just to gain a few seconds, because in reality I am not going to use Denoise a lot.
 
Bob and Bryan, thanks for the comments. For now I am happy with the results; my previous PC with an on-board GPU took 15 min(!) to denoise my Canon FF images. Now it takes 15 seconds. That's fast enough for me.
 
Estimated time 2 minutes, Measured time 121 sec.

CPU Intel 9750H
GPU Nvidia GTX 1660 Ti, 6GB DDR5
RAM 32 GB
Lenovo Legion
I see that my measured time is significantly longer than others with the same GPU :-o Still, I'm happy with the tool.
 
Gnits, and now I read the new 4090 cards have a 16-pin power connector. Wild.

As for the speed of Denoise: a job that takes one minute with only the CPU enabled, the GPU can do in 5-7 seconds. I only have a 2080 Ti.
 
I may be dreaming this, but after the 12.4 update my time has dropped from 45 seconds to 36.
 
Mine stayed at 87 seconds. Yours might improve if Adobe figured out an optimisation for your particular GPU.
 