
My Recent GPU Experience (Gigabyte Nvidia RTX 4070 Ti 12GB)


Gnits

Senior Member
Joined
Mar 21, 2015
Messages
2,844
Location
Dublin, Ireland.
Lightroom Experience
Power User
Lightroom Version
Classic
Lightroom Version Number
Classic 12.2.1
Operating System
Windows 10
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

"This is a relatively long discussion. It is worth reading the discussion in sequence, as posted."

To assist people returning to this discussion, the following are direct links to frequently used posts.

Post #20 contains a link to a raw file to use, if you wish to compare the performance of your rig to others.
https://www.lightroomqueen.com/comm...e-nvidia-rtx-4070-ti-12gb.47572/#post-1315509

Post #30 contains a summary of Denoise performance stats for a range of GPU’s and CPU’s from readers of this post.
https://www.lightroomqueen.com/comm...ia-rtx-4070-ti-12gb.47572/page-2#post-1315545

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

I am posting this on the Lightroom Classic forum, as I think it is most relevant to Classic users.

With GPU prices softening and the release of the 4xxx range, I decided to purchase the Gigabyte Nvidia RTX 4070 Ti 12GB. I was aware, before purchase, that I needed a minimum 750 Watt PSU. I already had a recently purchased Corsair RM850X and plenty of spare Corsair-supplied compatible internal power cables.

The GPU was tricky to install because of its size and the need to install an awkward bracket support system. The GPU was supplied with a Y cable, where the single end was plugged into the GPU and the two other ends plugged into separate power cables (not supplied by Gigabyte) which were then plugged into the PSU. I used Corsair cables, supplied with my PSU.

I triple checked that I had the polarity of all the cable connections correct, square pins fitted into square holes and rounded pins fitted into rounded holes etc. When I powered up my PC….. nothing happened. Nothing. No lights, no fans starting up, no status numbers on the internal motherboard display. Totally dead... and no clue as to what the problem might be. I feared the worst.

I completely disassembled my PC and removed the Corsair power supply. I then followed the Corsair instructions on how to test the PSU. All tests passed and all (28 I think) pin outs had exactly the correct voltage settings.

I rebuilt my rig, this time reinstalling my old GPU. With much relief, it booted perfectly. I had been worried that many of my main components, such as the motherboard, processor and PSU, had been toasted.

I sent a query to Gigabyte, requesting advice and troubleshooting steps, detailing the steps I had taken to test my PSU and my overall rig.

At the same time, I started researching any known issues with this range of GPUs. I was horrified to discover many incidents where GPUs, cables and other components had melted following installation of 4xxx GPUs. My heart sank. Many of the horror stories pointed to the supplied Y cable as the source. GPU makers pointed towards incorrectly seated motherboards, GPUs and/or cables. I can confirm that I had triple checked my connections and listened in each case for the ‘click’ as firm connections were made.

I ordered a Corsair 600W PCIe 5.0 12VHPWR Type-4 PSU power cable, directly from Corsair. It was shipped from France and would take a week.

6 days passed and two events occurred on the same day: 1) my Corsair 12VHPWR cable arrived, and 2) I received a response from Gigabyte. Essentially, Gigabyte told me to check my GPU (which I had already told Gigabyte I had done) or contact the supplier of the GPU (Amazon).

So, I installed the Corsair 12VHPWR cable I had just purchased, connecting the GPU to the power supply. There was no need for me to triple check all the connections, as the connectors would only connect in the correct way. I gingerly turned on power to the PSU and pressed the start button on the PC… the fan on the PSU slowly started to turn… lights appeared within the GPU, numbers started to appear on the motherboard LED display and the boot sequence started.

My PC…came to life… with the sound of the 'Hallelujah Chorus' reverberating in the background.



My Summary.

1. The response from Gigabyte was totally unacceptable.
2. In my opinion, the Y cable supplied by Gigabyte was inappropriate for the intended purpose.
3. Gigabyte's response was seriously underwhelming and totally ignored the elephant in the room, i.e. the Y power cable.

My advice to anyone installing a modern, fairly high-powered GPU is to triple check the connections needed to support the GPU and procure a single, fit-for-purpose cable assembly which connects the GPU directly to the PSU, without any third-party connectors. Make sure you have a modern PSU of sufficient wattage, and make sure you have sufficient spare slots on your PSU to cater for twin connections if required to power your GPU.
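As a rough illustration of the wattage point, here is a minimal sketch of the sort of back-of-the-envelope sum involved. It is not manufacturer guidance: the function name, the 100 W allowance for the rest of the system and the 50% headroom figure are my own assumptions, so always check the GPU maker's stated PSU requirement.

```python
# Rough PSU sizing sketch. The helper name, the 100 W "rest of system" allowance and the
# 50% headroom for transient power spikes are assumptions for illustration only.

def suggested_psu_watts(gpu_tdp_w: float, cpu_tdp_w: float,
                        rest_of_system_w: float = 100.0,
                        headroom: float = 0.5) -> float:
    """Sum the big consumers, then add generous headroom for transient spikes."""
    return (gpu_tdp_w + cpu_tdp_w + rest_of_system_w) * (1.0 + headroom)

# Example: a ~285 W GPU plus a ~125 W CPU suggests roughly a 750-800 W unit,
# consistent with the 750 W minimum mentioned above.
print(round(suggested_psu_watts(285, 125)))  # ~765
```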

PS. I have noticed a very pleasing improvement in Lr responsiveness… very timely, as the third event which occurred on the same day was the release of a new Lr version, with the new GPU-focused noise reduction and mask features. Serendipity? ‘Sweet’.

I am posting this story in the hope that others will not have to experience the pain and anguish I did, and to help avoid a potentially catastrophic meltdown of lots of very expensive components. I know every config is different... but the bottom line is that modern GPUs are power hungry and need cables fit for purpose.

Here is the cable I purchased directly from Corsair. In my opinion, this should be supplied with GPUs which need dual PSU power slots.

[Image: CorsairPsuGPUCable.jpg]
 
Thanks for the heads up. I may be in a similar situation soon. Might give Gigabyte a miss.
 
I think it was extremely short-sighted of Gigabyte in this case, but I am not sure if other GPU OEMs would be any different. It is better to make sure that you have the correct cable before dismantling anything or starting such an upgrade.

I was really annoyed that:
a. The cable was not included as part of an expensive GPU purchase.
b. There was no advance warning that I would need such a cable.
c. Gigabyte's response was sub-par.

Save a lot of time and pain by ordering the correct power cable at the same time as the GPU. Query your supplier in advance to check whether the correct cable is included.
In my opinion, avoid the use of Y cable connectors, as it is sometimes difficult to align the pins, especially if the cable is a 4-plus-2 style connector. In many cases people may have discarded or misplaced the cables which came with their PSU or PC build, so a cable will need to be ordered anyway.

Based on several reports I read, I was lucky; many have suffered significant damage to expensive components of their PC. I cannot say how representative these reports are, especially as the 4xxx range is relatively new. All the more reason to be extra careful with these internal power leads.

For me, it was an extremely bad experience, and even once I got the correct cable I had no confidence that it would solve my problem or that damage had not already been done.

Once I had the correct cable the install was a piece of cake.

One tip I would give is to keep your existing GPU in your PC at least until you have completed the installation, as your PC should boot with your existing card. You can then make sure you have the latest drivers for the new GPU, and the ‘go live’ moment should be as simple as moving your video cable from the old in-situ GPU to your new GPU.
 
I bought an ASUS RTX 3080 Ti last year, and it was a bit problematic for cables as well: it needed two PCIe cables, which my 1200W Silverstone did not provide in quantity despite its hefty rating, though I found some old cables from other power supplies (never throw cables out, you never know).

It lacks a brace to keep it from sagging, and that does worry me as it is in a tower. If it were in a case sitting flat it would be fine. I have it tied up with wire ties. I think some brands of the same GPU do have braces.

Effectively we are reaching the point where we have computers sitting on a card inside our computers that are more powerful than the main computer.

I am interested in hearing, after you've had a bit of time with it, how much you feel the beefy card helped. I was pleased that mine helped, but disappointed in how much it helped. Maybe the new AI stuff will change my mind. (I bought the card mainly for other applications; my prior card, an old GTX 970, did a decent job with LR/PS.)
 
I think the performance of our GPUs should not be too different. Maybe it might be worth getting the 12VHPWR cable above anyway, just in case your GPU is throttled due to power. I will report back in a few weeks when I have more hands-on experience. When I re-examined the Y cable supplied, it did look fairly hefty… but maybe the cables supplied with my Corsair PSU were a lower gauge. I notice the cables are stamped with a PCIe 5 logo.

My trigger for upgrading was returning from my first trip with an a7RV. Lr interactivity just crawled when adjusting sliders, etc. to the extent that I gave up. I definitely notice a major step up in terms of my sense of responsiveness. I do not care how long it takes to import or export or create previews. I just want decent responsiveness when editing individual images, moving from grid to loupe to develop and back… plus normal handling of develop sliders.

One very good reason for installing brackets for the GPU is that, due to their size and weight, sudden movements when handling or transporting your PC may result in significant bending moments and damage not only to the card but to the PCIe slots themselves.
 
Mine actually takes two separate PCIe power connectors, and they recommend not using the ones that are Y's. I just didn't have enough of them with the power supply. I'm good on power, a bit worried on physical support. It really flies with things that use it (Topaz, Starnet, the xxXTerminator products). I have not tried V12 very much yet, need to experiment there. Frankly with doing more astro and less sports am not using LR as much as I used to.
 
Mine actually takes two separate PCIe power connectors, and they recommend not using the ones that are Y's. I just didn't have enough of them with the power supply. I'm good on power, a bit worried on physical support. It really flies with things that use it (Topaz, Starnet, the xxXTerminator products). I have not tried V12 very much yet, need to experiment there. Frankly with doing more astro and less sports am not using LR as much as I used to.
Ah… interesting. I forgot about how slow Topaz De-noise / Sharpen used to be. Looking forward to seeing the difference. With all the advances in masking, etc., I am substantially reducing the number of times I visit Photoshop and/or use plug-ins. On the other hand, looking at some high ISO shots from my a7RV, I am disappointed with the level of noise present. Maybe the new Denoise and GPU combo will come to my rescue. Will report back in due course.
 
The new LR Denoise is rather impressive, but it is no longer non-destructive: it produces a DNG (Topaz would produce a TIF). So you still get 4 times the file size and some unknown amount of development baked in. But it's fast. Faster than Topaz, I think.
 
The new LR Denoise is rather impressive, but it is no longer non-destructive: it produces a DNG (Topaz would produce a TIF). So you still get 4 times the file size and some unknown amount of development baked in. But it's fast. Faster than Topaz, I think.
I guess that being away from the forums for a bit means that you forget some vocabulary.

If the new LR Denoise produces a DNG, isn't that output non-destructive? If it is destructive, what does it do and still merit the file type DNG?
If Topaz produces a TIF, is that not destructive in some way? If not, then my understanding of TIF is wrong (quite likely).
 
I guess that being away from the forums for a bit means that you forget some vocabulary.

Evidently it does. Non-destructive doesn't mean what you evidently think it does.

Think of Denoise as like a round trip to PS. The original file remains unchanged (that's the non-destructive part) but you get a file back with baked-in changes. The DNG from Denoise is no longer raw, but really a species of TIFF.
 
Evidently it does. Non-destructive doesn't mean what you evidently think it does.

Think of Denoise as like a round trip to PS. The original file remains unchanged (that's the non-destructive part) but you get a file back with baked-in changes. The DNG from Denoise is no longer raw, but really a species of TIFF.
Hal,

Thanks. I am trying to figure out if there are meaningful differences between the new Adobe Denoise and the existing Topaz products, aside from the extra cost of the latter.
 
Hal,

On further reflection, applying the name DNG to a species of TIFF seems to muddy understanding, not clarify it. Of course, Adobe could refine the meaning of DNG.
 
DNG is a fairly comprehensive wrapper. It can contain raw data or could contain fully demosaiced images. DNGs have always been this way. The results from Denoise are in the latter format.

On the other hand, Adobe isn't known for choosing names that facilitate understanding. ;)
 
The Enhanced DNGs produced by LrC are linear DNGs which retain the full raw edit capability, despite being already demosaiced. So unlike Tiffs, these DNGs have access to the full range of raw profiles, WB and more slider latitude.

So referring to them as Tiffs would not be helpful, IMO.
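As a rough way to see this distinction on disk, the sketch below lists a DNG's PhotometricInterpretation tags via exiftool (which must be installed). The expectation, hedged rather than Adobe-documented, is that a mosaiced camera raw DNG reports "Color Filter Array" while an Enhance/Denoise linear DNG reports "Linear Raw"; the helper name and sample filenames are mine.

```python
# Sketch: list the PhotometricInterpretation tags in a DNG using exiftool (must be on PATH).
# Hedged expectation: a mosaiced camera raw DNG reports "Color Filter Array", while a
# Denoise/Enhance "linear" DNG reports "Linear Raw"; preview IFDs may also report "RGB".
import subprocess

def photometric_interpretations(path: str) -> list[str]:
    # -a = include duplicate tags from every IFD, -s -s -s = print values only
    result = subprocess.run(
        ["exiftool", "-a", "-s", "-s", "-s", "-PhotometricInterpretation", path],
        capture_output=True, text=True, check=True,
    )
    return [line for line in result.stdout.splitlines() if line.strip()]

# Hypothetical filenames for illustration:
print(photometric_interpretations("IMG_1234.dng"))              # e.g. ['Color Filter Array', ...]
print(photometric_interpretations("IMG_1234-Enhanced-NR.dng"))  # e.g. ['Linear Raw', ...]
```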
 
I guess that being away from the forums for a bit means that you forget some vocabulary.

If the new LR Denoise produces a DNG, isn't that output non-destructive? If it is destructive, what does it do and still merit the file type DNG?
If Topaz produces a TIF, is that not destructive in some way? If not, then my understanding of TIF is wrong (quite likely).

DNG is a file specification like JPG, NEF or TIF (on which it is based). As such it is very extensible, and can contain defined containers of data. If one of those containers is raw, it contains the same information as the camera could produce in a proprietary raw file like an NEF. If the data is RGB, then the same image rules apply as with JPG or TIFF imports.

Adobe owns the DNG, TIF, PSD and other formats. All of the Lightroom derivative products like HDR, Panorama and Denoise are saved (exported) as derivative files in the DNG file specification. So it is always non-destructive, in that the original file remains intact.


 
DNG is a file specification like JPG, NEF or TIF (on which it is based). As such it is very extensible, and can contain defined containers of data. If one of those containers is raw, it contains the same information as the camera could produce in a proprietary raw file like an NEF. If the data is RGB, then the same image rules apply as with JPG or TIFF imports.

Adobe owns the DNG, TIF, PSD and other formats. All of the Lightroom derivative products like HDR, Panorama and Denoise are saved (exported) as derivative files in the DNG file specification. So it is always non-destructive, in that the original file remains intact.


See this Alice In Wonderland quote:

https://www.tate.org.uk/tate-etc/is...-use-word-it-means-just-what-i-choose-it-mean

Part of me spent a lot of time in an era when widely accepted terms, e.g. file types, were well understood and carefully defined. I guess that's my problem here.
 
Geesh... I look away for a brief bit for a bout with norovirus (or something) and a debate rages on a side comment I should not have made in this context. This was about GPUs, not Enhance/Denoise processing. I'm sorry to the OP for generating a tangent over terminology.
 
No issues with me. I suspect many may be dipping their toes into the 4xxx range of GPU cards, and I just wanted to give a heads up re issues I had with a very modern motherboard and power supply unit.

I had forgotten how slow various Topaz tools were on my well-specified machine and look forward to trying them again in the near future with my new GPU. And for a change, my timing was great, because the day I got my new GPU working, Adobe released the new Denoise features, so I was glad to read the Denoise discussion.
 
I'm a luddite as far as GPUs are concerned. LrC runs beautifully on an i7-124000K with only an integrated GPU. However, Denoise has changed all that. Denoise is rather addictive in that it makes my poor photographic technique look half acceptable, but it's woefully slow on an integrated GPU (~4 min). I expect quite a bit of tweaking of Denoise in the weeks to come. However, unless there is a quantum leap in integrated GPU performance, I'm resigned to needing a discrete GPU in the near future regardless. I would much rather put the money towards an i9-13900K, but AFAIK that promises only about a 30% improvement in graphics. Looking forward to someone like Puget recommending an optimum GPU for LrC V12.3, but that will be a while away.
 
I originally wrote this as a suggestion that someone donate an image, so that those of us who wanted to could process it identically and compare speeds. The CPU might contribute, but one would assume the GPU is the most intensive aspect. And to confirm, I ran a 62 Mpix image that was very noisy (from a Sony A7R IV) and it took about 17 seconds with the GPU; then I turned off the GPU and it took... about 17 seconds.

Yes, I restarted LR in between.

I confirmed with Task Manager that the GPU did a lot of work in that time.

I confirmed it is off in LR.

Is there a different setting?

So anyway I can't tell how much difference the GPU makes with my CPU, which removes a significant part of this, but....

That aspect aside, I'd be perfectly willing to have others run this same image and compare speeds, and we could build a table of GPU and time to run. With a big image like this it should take a significant amount of time on slower systems. If anyone cares to partake, here is the image I used:

https://drive.google.com/file/d/15UAKmx9IA1k2Zt877ShHeUewCcHquyhu/view?usp=sharing

I did no cropping or other settings, then did a Denoise and timed from hitting Enhance in the dialog box until the image came back and was fully resolved on the screen (i.e. preview refreshed, not just the spinning wheel stopped).

AMD 3970X with RTX 3080 Ti: 17.5 seconds

Anyone care to join?

And does anyone know if I can turn off the GPU?


[Image: GPU.jpg]
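For anyone who wants to collect the timings reported in this thread, here is a minimal sketch that sorts the results and shows relative speed. The sample entries are simply figures quoted by posters in this thread (including replies below), rounded; the dictionary itself is my own construction.

```python
# Sketch: tabulate community Denoise timings (in seconds) and show speed relative to the fastest.
# Sample entries are timings quoted in this thread; replace or extend them with your own result.
results = {
    "AMD 3970X + RTX 3080 Ti": 17.5,
    "Apple M1 Mac Mini (16GB)": 132.0,
    "Ryzen 5900X + GTX 1050 Ti": 305.0,
}

fastest = min(results.values())
for rig, seconds in sorted(results.items(), key=lambda item: item[1]):
    print(f"{rig:28s} {seconds:7.1f} s   {seconds / fastest:5.1f}x the fastest time")
```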
 
Denoise uses the GPU, irrespective of the setting on the Preferences Performance tab.

On my M1 Mac Mini (16GB), it takes just over 2 minutes (132 seconds).
 
Denoise uses the GPU, irrespective of the setting on the Preferences Performance tab.

On my M1 Mac Mini (16GB), it takes just over 2 minutes (132 seconds).
Being Mac-ignorant, is the integrated GPU supported at all by LR?

I wonder why they elected to ignore their own preference setting?
 
With my very lowly-specced graphics card:

Estimated time: 7 minutes
Actual time: 5 minutes 5 seconds

CPU: AMD Ryzen 5900X
GPU: Nvidia GeForce GTX 1050 Ti OC 4GB
RAM: 32 GB 3200 MHz DDR4
OS: Windows 10
 
On my machine (specs in my sig) it took 1:27. LRC estimated one minute.
 