
My Recent GPU Experience (Gigabyte Nvidia RTX 4070 Ti 12GB)


Gnits

Matt O’Brien
Joined
Mar 21, 2015
Messages
2,971
Location
Dublin, Ireland.
Lightroom Experience
Power User
Lightroom Version
Classic
Lightroom Version Number
Classic 12.2.1
Operating System
Windows 10
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

"This is a relatively long discussion. It is worth reading the discussion in sequence, as posted."

To assist people returning to this discussion, the following are direct links to frequently used posts.

Post #20 contains a link to a raw file to use, if you wish to compare the performance of your rig to others.
https://www.lightroomqueen.com/comm...e-nvidia-rtx-4070-ti-12gb.47572/#post-1315509

Post #30 contains a summary of Denoise performance stats for a range of GPUs and CPUs from readers of this thread.
https://www.lightroomqueen.com/comm...ia-rtx-4070-ti-12gb.47572/page-2#post-1315545

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

I am posting this on the Lightroom Classic forum, as I think it is most relevant to Classic users.

With GPU prices softening and the release of the 4xxx range, I decided to purchase the Gigabyte Nvidia RTX 4070 Ti 12GB. I was aware, before purchase, that I needed a minimum 750 Watt PSU. I already had a recently purchased Corsair RM850X model and plenty of spare Corsair-supplied compatible internal power cables.

The GPU was tricky to install because of its size and the need to fit an awkward support bracket system. The GPU was supplied with a Y cable: the single end plugged into the GPU, and the two other ends plugged into separate power cables (not supplied by Gigabyte) which were then plugged into the PSU. I used Corsair cables, supplied with my PSU.

I triple checked that I had the polarity of all the cable connections correct: square pins fitted into square holes, rounded pins fitted into rounded holes, etc. When I powered up my PC… nothing happened. Nothing. No lights, no fans starting up, no status numbers on the internal motherboard display. Totally dead... and no clue as to what the problem might be. I feared the worst.

I completely disassembled my PC and removed the Corsair power supply. I then followed the Corsair instructions on how to test the PSU. All tests passed and all (28 I think) pin-outs showed exactly the correct voltages.

I rebuilt my rig, this time reinstalling my old GPU. With much relief, it booted perfectly. I had been worried that many of my main components, such as the motherboard, processor, PSU, etc., had been toasted.

I sent a query to Gigabyte, requesting advice and troubleshooting steps, detailing the steps I had taken to test my PSU and my overall rig.

At the same time, I started researching any known issues with this range of GPUs. I was horrified to discover many incidents where GPUs, cables and other components had melted following installation of 4xxx GPUs. My heart sank. Many of the horror stories pointed to the supplied Y cable as the source. GPU makers pointed towards incorrectly seated motherboards, GPUs and/or cables. I can confirm that I had triple checked my connections and listened in each case for the 'click' as firm connections were made.

I ordered a Corsair 600W PCIe 5.0 12VHPWR Type-4 PSU power cable, directly from Corsair. It was being shipped from France and would take a week.

Six days passed and two events occurred on the same day: 1) my Corsair 12VHPWR cable arrived, and 2) I received a response from Gigabyte. Essentially, Gigabyte told me to check my GPU (which I had already told Gigabyte I had done) or contact the supplier of the GPU (Amazon).

So, I installed the Corsair 12VHPWR cable I had just purchased, connecting the GPU to the power supply. There was no need for me to triple check all the connections, as the connectors would only connect in the correct way. I gingerly turned on power to the PSU and pressed the start button on the PC… the fan on the PSU slowly started to turn… lights appeared within the GPU, numbers started to appear on the motherboard LED display and the boot sequence started.

My PC…came to life… with the sound of the 'Hallelujah Chorus' reverberating in the background.



My Summary.

1. The response from Gigabyte was totally unacceptable.
2. In my opinion, the Y cable supplied by Gigabyte was inappropriate for the intended purpose.
3. Gigabyte's response was seriously underwhelming and totally ignored the elephant in the room, i.e. the Y power cable.

My advice to anyone installing a modern, fairly high-powered GPU is to triple check the connections needed to support the GPU and to procure a single, fit-for-purpose cable assembly which connects the GPU directly to the PSU, without any third-party connectors. Make sure you have a modern PSU of sufficient wattage, and make sure you have sufficient spare slots in your PSU to cater for twin connections if required to power your GPU.
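To put rough numbers on that advice, here is a minimal Python sketch of the kind of power-budget arithmetic involved. The wattage figures are purely illustrative assumptions (a nominal board power for an RTX 4070 Ti class card, a typical high-end CPU under load, and a generic allowance for the rest of the system), not measurements from my rig; check your own components' specs.

```python
# Rough PSU power-budget sketch (illustrative figures, not measurements).
# Each component draw below is an assumed nominal value.

components_w = {
    "GPU (RTX 4070 Ti class, nominal board power)": 285,  # assumed
    "CPU under load":                                150,  # assumed
    "Motherboard, RAM, drives, fans":                 75,  # assumed
}

base_load = sum(components_w.values())

# Modern GPUs can spike well above nominal board power for short
# transients, so size the PSU with generous headroom.
transient_margin = 1.5  # assumed safety factor

recommended_psu = base_load * transient_margin

print(f"Estimated sustained load: {base_load} W")
print(f"Suggested minimum PSU:    {recommended_psu:.0f} W")
# With these assumptions this lands in the ~750-850 W range, which is
# consistent with the card maker's 750 W minimum and my 850 W RM850X.
```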

PS. I have noticed a very pleasing improvement in Lr responsiveness… very timely, as the third event which occurred on the same day was the release of a new Lr version, with the new GPU-focused noise reduction and mask features. Serendipity? 'Sweet'.

I am posting this story in the hope that others will not have to experience the pain and anguish I did, and to help avoid a potentially catastrophic meltdown of lots of very expensive components. I know every config is different... but the bottom line is that modern GPUs are power hungry and need cables fit for purpose.

Here is the cable I purchased directly from Corsair. In my opinion, this should be supplied with GPUs which need dual PSU power slots.

CorsairPsuGPUCable.jpg
 
Is this a tongue in cheek comment or real?
There will be a LrC 13 in October, 14 in 2024 and so on. Adobe, like all the developers, is on an upgrade cycle which happens every fall. You don't have to upgrade when using perpetual licence software unless you need raw support for a new camera or you want the new features. Perpetual licence companies never add a major upgrade mid year.

The only one I know of that does not do this every fall is DxO PureRAW: March to March. A fluke or by design? I don't know. I do know they don't have to offer it at a reduced price like PhotoLab (fall upgrade), because after a month the Black Friday sales start.

In the past Adobe has been criticized for lacklustre upgrades in the fall, criticism which likely comes from people who don't like Adobe. Sometimes they were nothing to write home about, but look what just happened: mid year, and we got Adobe Denoise AI. One of the top 3 updates since LrC 7 in 2017, IMO.

I can't believe I just wrote 2024. One more year and it will be the 25th anniversary of when the world was supposed to shut down because, to save disk space, software did not store the entire year; back when this all started, 1999 was just 99.
 
I don't know what everyone else thinks, but I have been thinking about this today. While I would like a Canon 500 or 600 f/4, I doubt I'll ever see one. My wife supports my hobby but it is not her hobby, so I draw the line, until we win a lottery.

Not only is Adobe Denoise AI going to save me a few hundred a year in third-party NR apps, it just may save me tens of thousands in equipment. This may not be a wall mount, but for ISO 20,000 it's not too bad for the web. I kept the shutter speed high to capture them when they take off. I'd prefer lower, but I was messing around today and was surprised again. I did have good exposure so I didn't have to lift the shadows. Also there is no colour shift using Adobe Denoise.


_G7A6996-Enhanced-NR.jpg


This one was practising levitation :)

_G7A7002-Enhanced-NR.jpg
 
Sorry, a correction to my earlier post: sometimes "there was nothing" to write home about.
 
I should also mention the second file was ISO 6400.
 
I upgraded to the RTX 4070 from my old GTX 1070 Ti. I was going to get an RTX 4070 Ti, but realised when I built this PC that I had only put in a 650 Watt PSU.

CPU : AMD Ryzen 9 5900X
Memory : 32GB
GPU : RTX 4070 12GB
O/S : Win 10
Denoise : Est 15 secs Actual 20 secs

A big improvement over the old GPU.
I shoot a lot of weddings and deal with very low lighting most of the time. The Denoise AI is a huge help along with the AI masking.
 
Is this a tongue in cheek comment or real?
@BobT

Tongue in cheek, but not entirely.

When Adobe releases the next major version of LrC in October, we can expect some big advances in feature sets. I would like to upgrade my 3060 Ti, but I'm going to hold off until the Adobe announcement. Of course, by then we may have "solid" rumors about the next generation of GPU cards. It's a treadmill, and we all need to know when to step off.

EDIT: I also future-proofed this PC build by putting in an 850W Seasonic power supply.
 
Yes, V12.3 brought us AI Denoise. That's huge! Yet there has been not one tweak, and we know there are issues with Apple's Neural Engine and Intel Arc GPUs. I would be very disappointed if Adobe just left it at that. A V12.3.1 cannot be far away.
 
I wouldn’t expect a double dot release, those are reserved for huuuuuge workflow-destroying bugs. They’ll undoubtedly continue working on Denoise, but a lot of the issues will be waiting on third parties (e.g. Apple with the neural engine bug, drivers for other graphics cards) so fixes might not be as quick as you hope.
 
"It's a treadmill, and we all need to know when to step off."
Or to step on...
 
Humor that failed. You wrote: "It's a treadmill, and we all need to know when to step off." I should have written: 'Or know when to step on'.

Some of us got it. Keep trying.


 
Here are my results with the file from page 20.

CPU: i7 3770k
RAM 16GB
GPU: RTX 2060
WIN10

Est Time: 35 sec
Act Time: 42 sec

My 20mp OM-1 files take 15s, so I am fine with this GPU for now.
 
I think the stats are really useful. It is a pity there are so few Mac datapoints. I was hoping to see a range of M1 / M2 and M3 stats at some stage.
 
Good idea… I hope that would not be breaking any protocols, etc. Will give it a go… probably in the morning…
 
Any data that you get from M1/2 users (such as my own "disappointing" M1 Mac Mini results) are likely to be negatively skewed by the current Apple Neural Engine issue.
 
Could well be, but note the fastest time so far is my Apple M2 Max.
 
Also, you'll get no response from those with GPUs which simply won't run Denoise, or run it badly, like the Intel GPUs and some integrated GPUs. However, lack of response is information in itself.
 
Newbie here

I just ran AI NR in the latest LrC using the file from page 20.

CPU: i9 12900K
RAM 64GB
GPU: GTX 1660 Super
WIN10

Est Time: 1min
Act Time: 1:22

I'm finding the NR quite useful, so I'm planning on a GPU upgrade to reduce the batch processing time. Thanks for the good work here.
 
The i9-12900K has the same iGPU as my i7-12700K. I have no discrete GPU and 32GB of RAM. Denoise runs entirely on the iGPU. It takes 4min 10sec on a 20MB file. I'd be interested to know how long your machine would take to run Denoise just on the iGPU. Only if you're inclined to do so, of course; you should be able to easily select the GPU in Edit > Preferences > Performance. I totally understand if you'd rather not try it.
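For anyone wanting to line up the numbers reported so far, here is a small Python sketch that tabulates the Denoise times quoted in the posts above and works out a rough speed-up relative to the slowest result. Only the times themselves come from this thread; the labels and helper code are my own illustrative scaffolding, and note that the iGPU-only figure was reported for "a 20MB file" rather than necessarily the shared test file from post #20.

```python
# Denoise times as reported in this thread (mm:ss), one image each.
# Times are taken from the posts above; everything else is scaffolding.

results = {
    "RTX 4070 12GB (Ryzen 9 5900X)":   "0:20",
    "RTX 2060 (i7-3770K)":             "0:42",
    "GTX 1660 Super (i9-12900K)":      "1:22",
    "iGPU only (i7-12700K, 20MB file)": "4:10",
}

def to_seconds(t: str) -> int:
    """Convert an 'm:ss' string to whole seconds."""
    minutes, seconds = t.split(":")
    return int(minutes) * 60 + int(seconds)

# Use the slowest reported rig as the comparison baseline.
baseline = max(to_seconds(t) for t in results.values())

for rig, t in sorted(results.items(), key=lambda kv: to_seconds(kv[1])):
    secs = to_seconds(t)
    print(f"{rig:<36} {secs:>4d} s   {baseline / secs:4.1f}x vs slowest")
```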
 