Is there a difference in monitor sharpness or clarity when connecting via a D-sub cable vs. a DVI cable? Meaning...will connecting via a DVI cable yield a "sharper" on-screen image compared to a 15-pin D-sub analog cable (both ends are D-sub...no converter)?
I'm using a 5-month-old LG L246WP monitor. It's an S-MVA panel. The digital connection with this monitor runs from an HDMI port on the monitor to the DVI port on my stock ATI Radeon 1650 graphics card, using the supplied HDMI-to-DVI cable. Yes...there is no DVI port on the monitor...
I know what you're thinking..."swap the cables and see for yourself".
Well, that is what I did last night, after running this monitor via the analog D-sub connection for 5 months (don't ask why:roll:...). But when I switched to the HDMI/DVI cable the screen immediately started to flicker...badly...and would intermittently 'black out' for 5-second stretches before coming back on. It didn't matter what applications, if any, I had running.
So I couldn't really evaluate whether my images appeared "sharper", since the blackouts and flickering were a real problem.
The only change I made was selecting the input ("HDMI" or "RGB", respectively, for digital or analog) via the monitor's on-screen display. No changes were made within ATI's control center dialog. Resolution and refresh rate were identical with either cable (1920 x 1200 @ 60Hz).
Oh yeah...I did update to the latest drivers for my ATI 1650. It didn't change a thing.
I'm really just trying to maximize all that this monitor can offer for my LR/CS3 editing needs. I do not do any gaming or watch videos on this setup. Overall I'm happy with the monitor, but when I view the same image file on my LG and then on my girlfriend's 23" Cinema Display, it just looks noticeably better on her monitor. It just seems "sharper" or more "crisp".
Maybe it's her S-IPS panel vs. my S-MVA panel that accounts for the difference, but it got me thinking that maybe my analog D-sub connection was a limiting factor...thus the cable switch, the subsequent flicker/blackout issue, and then this post... Thanks for any insights.