
My Recent GPU Experience (Gigabyte Nvidia RTX 4070 Ti 12GB)


Gnits
Senior Member
Joined: Mar 21, 2015
Messages: 2,531
Location: Dublin, Ireland
Lightroom Experience: Power User
Lightroom Version: Classic 12.2.1
Operating System: Windows 10
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

"This is a relatively long discussion. It is worth reading the discussion in sequence, as posted."

To assist people returning to this discussion, the following are direct links to frequently used posts.

Post #20 contains a link to a raw file to use, if you wish to compare the performance of your rig to others.
https://www.lightroomqueen.com/comm...e-nvidia-rtx-4070-ti-12gb.47572/#post-1315509

Post #30 contains a summary of Denoise performance stats for a range of GPUs and CPUs from readers of this thread.
https://www.lightroomqueen.com/comm...ia-rtx-4070-ti-12gb.47572/page-2#post-1315545

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

I am posting this on the Lightroom Classic forum, as I think it is most relevant to Classic users.

With GPU prices softening and the release of the 4xxx range, I decided to purchase the Gigabyte Nvidia RTX 4070 Ti 12GB. I was aware before purchase that I needed a minimum 750 W PSU. I already had a recently purchased Corsair RM850X and plenty of spare Corsair-supplied compatible internal power cables.

The GPU was tricky to install because of its size and the need to fit an awkward bracket support system. The GPU was supplied with a Y cable: the single end plugged into the GPU, and the two other ends plugged into separate power cables (not supplied by Gigabyte), which were then plugged into the PSU. I used the Corsair cables supplied with my PSU.

I triple-checked that the polarity of all the cable connections was correct: square pins into square holes, rounded pins into rounded holes, and so on. When I powered up my PC... nothing happened. Nothing. No lights, no fans starting up, no status numbers on the internal motherboard display. Totally dead, and no clue as to what the problem might be. I feared the worst.

I completely disassembled my PC and removed the Corsair power supply. I then followed the Corsair instructions for testing the PSU. All tests passed and all (28, I think) pin-outs showed exactly the correct voltages.

I rebuilt my rig, this time reinstalling my old GPU. To my great relief, it booted perfectly. I had been worried that many of my main components (motherboard, processor, PSU, etc.) had been toasted.

I sent a query to Gigabyte requesting advice and troubleshooting steps, detailing what I had already done to test my PSU and my overall rig.

At the same time, I started researching known issues with this range of GPUs. I was horrified to discover many incidents where GPUs, cables and other components had melted following installation of 4xxx GPUs. My heart sank. Many of the horror stories pointed to the supplied Y cable as the source. GPU makers pointed towards incorrectly seated motherboards, GPUs and/or cables. I can confirm that I had triple-checked my connections and listened in each case for the 'click' as firm connections were made.

I ordered a Corsair 600W PCIe 5.0 12VHPWR Type-4 PSU power cable directly from Corsair. It was shipped from France and would take a week.

Six days passed and two events occurred on the same day: 1) my Corsair 12VHPWR cable arrived, and 2) I received a response from Gigabyte. Essentially, Gigabyte told me to check my GPU (which I had already told Gigabyte I had done) or contact the supplier of the GPU (Amazon).

So I installed the Corsair 12VHPWR cable I had just purchased, connecting the GPU to the power supply. There was no need to triple-check all the connections, as the connectors would only fit together the correct way. I gingerly turned on power to the PSU and pressed the start button on the PC... the fan on the PSU slowly started to turn, lights appeared within the GPU, numbers appeared on the motherboard LED display, and the boot sequence started.

My PC came to life, with the sound of the 'Hallelujah Chorus' reverberating in the background.



My Summary.

1. The response from Gigabyte was totally unacceptable.
2. In my opinion, the Y cable supplied by Gigabyte was inappropriate for its intended purpose.
3. Gigabyte's response was seriously underwhelming and totally ignored the elephant in the room, i.e. the Y power cable.

My advice to anyone installing a modern, fairly high-powered GPU is to triple-check the connections needed to support the GPU and procure a single, fit-for-purpose cable assembly that connects the GPU directly to the PSU, without any third-party connectors. Make sure you have a modern PSU of sufficient wattage, with enough spare sockets to cater for twin connections if your GPU requires them.

PS: I have noticed a very pleasing improvement in Lr responsiveness. Very timely, as the third event that occurred on the same day was the release of a new Lr version, with the new GPU-focused noise reduction and masking features. Serendipity? Sweet.

I am posting this story in the hope that others will not have to experience the pain and anguish I did, and to help avoid a potentially catastrophic meltdown of lots of very expensive components. I know every config is different, but the bottom line is that modern GPUs are power hungry and need cables fit for purpose.

Here is the cable I purchased directly from Corsair. In my opinion, a cable like this should be supplied with GPUs that need dual PSU power sockets.

[Attached image: CorsairPsuGPUCable.jpg]
 
Just tested mine with the sample image provided:

12600K, 32 GB RAM, RX 580 8GB: estimated 2 min, actual 2:10.

Hopefully updated results with an RTX 3070 tomorrow!
 
RTX 3070 up and running:

12600K, 32 GB RAM, RTX 3070: estimated 20 sec, actual 24 sec (average of three runs).

Masking is a bit quicker and smoother with large files too. I think the 3070 has been a good choice for me; I was tempted by the 4070, but got a good used 3070 for half the price, so I think I have found a good balance.
 
I am extremely pleased to see these results posted. When I purchased my 4070 Ti I was completely in the dark, and I might have opted for a 3-series card if I had had access to the real-world experiences posted here. Hopefully I have at least given myself some future-proofing. Also, I was so worried about power issues, power cables, etc.

I would urge anyone with a power port on their GPU similar to the image posted at #1 to check that they have a robust single cable from the GPU to the power supply, and not a collection of cables cobbled together.
 
The nature of these things is that someone always has to be the first person through the door, and I for one thank you for doing so with the 4070 Ti and for starting this thread! The 4070 Ti is a cracking card, and whilst I suspect we will see bigger gen-on-gen AI increases in the 5000-series cards whenever they release, I'm sure the 4070 Ti will hold its own just fine!
 
Just came across this thread and thought I would give it a try. I bought a new PC just before the AI features appeared in Lightroom and unfortunately under-specified the graphics card, as it wasn't used much prior to these features.
Estimated time to Denoise 7 min; actual 7:40.
Ryzen 5700X, 64GB DDR4 RAM @ 3200MHz, GTX 1650
 
Thought I'd add my recent experience. I'm 81, on a budget, and just built a PC with an i5 12600, ASUS B760 motherboard, and a used RTX 3060 (non-Ti). In my first test, Denoise AI took 10-12 seconds to process a Canon R6 20 MP file. Very relieved, as I often batch process 200-300 pics in Lr shot indoors at ISOs ranging from 1250 to 12000. Denoise AI promises to be immensely helpful. See a sample before-and-after pic here, from our school's science fair last spring: the photo was shot with the R6 at ISO 20000, and the "after" pic looks to my ancient eyes like it could have been shot at ISO 100. https://www.lwsphotos.com/Misc-Pix/Lightroom-noise-red-AI/
 
My compliments to you on several fronts; you have just ticked so many boxes. Building such a tidy rig to meet your needs and getting a decent GPU installed will probably inspire lots of others to tackle moving on from first-generation GPUs.

I am pleased to see practical progress in the use of AI for post processing, especially as sensor resolutions continue to grow.

I am also pleased that there is a lot more accumulated experience on relative GPU performance as it relates to Lr, so a big thanks for adding to the collective wisdom.
 
Thanks for your kind note, Gnits. I designed the new PC specifically with LR Denoise AI in mind. I considered simply swapping to a Mac Mini M2 after my former PC died of old age, but I was put off by the delays and unanswered questions surrounding Apple's and Adobe's efforts to resolve the M2 neural engine bugs that prevent macOS from giving decent Denoise AI processing times. (I hate to be held captive by a single-source provider with a soft record of listening to its customers' complaints. Being able to upgrade and swap components freely is so liberating.) I felt that upgrading the GPU was worth the expense, since I shoot lots of school events, and parents ARE pixel peepers: if there is ugly grain in little Sally and Bobby's faces, the parents will notice. :)
 
You might be interested in the following quote from a Reuters article today. It suggests Nvidia has things pretty well sewn up in the AI field. The takeaway is that we can't expect any serious competition in AI-capable GPUs any time soon. The only serious competitor is Intel, which seems more gaming-oriented and has issues with AI Denoise anyway.
"Sept 11 (Reuters) - Nvidia's (NVDA.O) supremacy in building computer chips for artificial intelligence has chilled venture funding for would-be rivals, investors said, with the number of U.S. deals this quarter falling 80% from a year ago.
The Santa Clara, California company dominates the market for chips that work with massive amounts of language data. Generative AI models get incrementally smarter through exposure to more data, a process called training.
"
 
"Sept 11 (Reuters) - Nvidia's (NVDA.O) supremacy in building computer chips for artificial intelligence has chilled venture funding for would-be rivals, investors said, with the number of U.S. deals this quarter falling 80% from a year ago.
That's the same effect that Microsoft had maybe 20 years ago in product areas like productivity (MS Office), if I remember correctly. A friend went to an interview with a startup that was going to compete with Excel, but he told them outright that they could never succeed. He was right.
 
As a thank-you to everyone who posted here: I'm a Photoshop ACR user rather than a Lightroom user, but here's the Denoise result on the standard Sony basketball photo on my 2020 iMac 27"

38 seconds

iMac (Retina 5K, 27-inch, 2020)
3.8 GHz 8-Core Intel i7
40 GB 2133 MHz DDR4
AMD Radeon Pro 5700 XT 16 GB

I was debating buying an M2 Max Studio, but based on what I'm reading here I don't think it's worth it. I saw the post from someone with a 2023 MacBook Pro and an M2 Max: 17 seconds for the Sony basketball image. I'm working on 6000x4000 Canon R3 images similar to that poster's Canon R6 II images. Dropping from 18 seconds per image to 10 seconds per image isn't enough of a change to justify buying a $2,800 computer and a $1,600 Apple monitor. The math I ran when I bought the refurb 2020 model in 2021 is still holding up. I'll wait another generation. I don't need 4K video editing right now; that might be my only pressure point on upgrading.
 
New to the forum, but I appreciate the effort and organization charting progress and discussion on this issue.
Today's input from me:
I upgraded from an Nvidia RTX 2070 Super (8GB VRAM) to the Nvidia Founders Edition RTX 4090 (24GB VRAM).
LR 12.x Denoise @ 50, run on copies of the Sony test image in batches of 5, with 5 runs averaged; I stopped the clock when the 5th DNG hit the filmstrip.
Estimated time 1 min; actual ~36.5 s, for an average per-image Denoise time of 7.3 s.

I will be processing my 45 MP Nikon NEF files, so I'm expecting 5.5 seconds or so per file (see the scaling sketch below).
The difference between a 20-second-plus operation and a 5-second operation means I can keep Denoise as part of my interactive flow, and not be interrupted by a distracting delay or have to preselect images and run a batch.
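For anyone who wants to reproduce the arithmetic, here is a minimal Python sketch (illustrative function names of my own; it assumes Denoise time scales linearly with megapixels, which is a simplification):

def per_image_time(batch_seconds: float, batch_size: int) -> float:
    # Average Denoise time per image for a timed batch of identical files.
    return batch_seconds / batch_size

def scaled_estimate(seconds: float, from_mp: float, to_mp: float) -> float:
    # Estimate Denoise time at a different resolution, assuming time is linear in pixel count.
    return seconds * (to_mp / from_mp)

avg = per_image_time(36.5, 5)    # ~7.3 s per 61 MP Sony file
print(f"{avg:.1f} s per image")
print(f"{scaled_estimate(avg, 61, 45):.1f} s estimated per 45 MP NEF")    # ~5.4 s

Under that linear assumption, ~7.3 s per 61 MP file works out to roughly 5.4 s per 45 MP NEF, consistent with the 5.5 s expectation above.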
 
Great to see the results of this high-end combo. It helps put the wider range of GPUs into relative perspective.

Thanks for posting.
 
I did indeed use the downloaded basketball court Sony ARW image, to make my results comparable.
I just did the batch and re-test to minimize any error in my timing. The filmstrip appears to take an extra second or more to update once Denoise has completed, and at 5-8 seconds that makes more of a difference than at 17 minutes (my friend's laptop).
 
It looks like the M3 MacBook PRO models will be announced next week by Apple. It will be interesting to see how much the M3 improves performance.
 
Out of curiosity, I tested a Ryzen 7 Pro 7840U. Against an estimated 6 minutes, it took 173 seconds. Not too bad for an APU, and almost as good as an Nvidia 1060.
 
This is a very useful thread for Lr performance, but I have a specific question in mind. As we all know, AI masking and Denoise use the GPU. The thing is, I don't really care about the Denoise time per se; 20 or 30 seconds is all the same to me. I'd like to know the cheapest GPU I could buy to make the UI experience fluid. For me, it's the masking part that became a nightmare when I upgraded Lr to version 12. It's barely usable, as just switching to this feature or even adjusting basic exposure is a pain. So it's upgrade time, and the less money the better. I'm even thinking about a 16GB Mac Mini for about CAD 1,000, if it could match a $1,600 PC (RTX 4060).

Current PC, which ran great until 12.x: i5-6600, 16GB RAM, GTX 750 Ti
 
Unfortunately, the GPU is not the only performance variable. I have the RTX 4070 Ti 12GB, really fast internal bandwidth, a very recent motherboard, 64 GB of fast memory and the fastest M.2 drives I could find. I am still unhappy with the typical workflow response times in the Library and Develop modules. The pivot point for me was upgrading from the 42 MP a7R III to the 61 MP a7R V. I would have preferred a 40 MP a7R V, but that was not an option.

It seems that a CPU with a fast primary core, rather than a lot of cores, is a key factor on Windows platforms. Also, people are using much bigger screens, so far more screen real estate has to be rendered than on 1K and 2K screens.

It is very difficult to find the magic key.
 
Fully agree, however there is a caveat. The catch-cry has been: more memory, more performance. I have a year-old MSI Z690 motherboard with four memory slots. The manual warns that filling all four slots runs slower than filling two. So one could max out the memory and end up with a slower machine. Of course, much depends on the application. I have two slots filled with 32GB of RAM; all four slots could be filled for 64GB. Would LrC run faster or slower? I don't know, but since it runs perfectly as is, it's not worth the money to find out. I'm just warning of a counterintuitive limitation on some motherboards.
 
I too have four memory slots, but have deliberately populated only two, with 2x32GB. I would install another 2x32GB in a heartbeat, but I strongly suspect that Lr is not even using the existing memory.
 
Unfortunately, the GPU is not the only performance variable.

Also, people are using much bigger screens, so far more screen real estate has to be rendered than on 1K and 2K screens.
If you upgrade to a larger screen with, say, 4K resolution, you may need to upgrade your video card too. Lots of variables.
 
And one more test, this time on my old laptop with a Ryzen 7 Pro 4750U with Vega 7 graphics.
It took 8 min 38 sec.
A pretty impressive difference between Ryzen 4000 and 7000: three times faster!
 
A recent result reported for the 7900 XTX, AMD's latest and fastest GPU, priced around $1,000: it runs Denoise on a 24 MP image in 7 s, or about 19 s normalized to the 61 MP comparison.
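As a quick sketch of that normalization (again assuming Denoise time scales linearly with megapixels; figures from the post above):

# Scale a 24 MP Denoise time up to the 61 MP Sony test file for comparison.
measured_s, measured_mp, reference_mp = 7.0, 24.0, 61.0
normalized = measured_s * reference_mp / measured_mp
print(f"~{normalized:.0f} s at {reference_mp:.0f} MP")    # ~18 s

Straight linear scaling gives about 18 s, in the same ballpark as the ~19 s figure quoted.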
 