Monitor "sharpness": Analog or Digital cable...

shoots

New Member
Joined
Feb 20, 2008
Messages
14
Is there a difference in monitor sharpness or clarity when connecting via a D-sub cable vs. a DVI cable? Meaning...will connecting via a DVI cable yield a "sharper" on-screen image compared to a 15-pin D-sub analog cable (both ends are D-sub...no converter)?

I'm using a 5-month-old LG L246WP monitor. It's an S-MVA panel. The digital connection with this monitor runs from an HDMI port on the monitor to the DVI port on my stock ATI Radeon 165' graphics card. The supplied cable is an HDMI-to-DVI cable. Yes...there is no DVI port on the monitor...

I know what you're thinking..."swap the cables and see for yourself".

Well, that is what I did last night, after running this monitor via the analog D-sub connection for 5 months (don't ask why:roll:...). But when I switched to the HDMI-to-DVI cable, the screen immediately started to flicker...badly...and would intermittently 'black out' for 5-second stretches before coming back on. It didn't matter what applications, if any, I had running.

So I couldn't really evaluate whether my images appeared "sharper", since the blackouts and flickering were a real problem.

The only change I made was selecting the input ("HDMI" or "RGB", respectively, for digital or analog) via the monitor's on-screen display. No changes occurred within ATI's control center dialog. Resolution and refresh rate were identical with either cable (1920 x 1200 @ 60 Hz).

Oh yeah...I did update to the latest drivers for my ATI 165'. Didn't change a thing.

I'm really just trying to maximize all that this monitor can offer for my LR/CS3 editing needs. I do not do any gaming or watch videos with this setup. Overall I'm happy with the monitor, but when I view the same image file on my LG and then on my girlfriend's 23" Cinema Display, it just looks noticeably better on her monitor. It just seems "sharper" or more "crisp".

Maybe it's her S-IPS panel vs. my S-MVA that accounts for the difference, but it got me thinking that maybe my analog D-sub connection was a limiting factor...thus the cable switch, the subsequent flicker/blackout issue, and then this post... Thanks for any insights.
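For a sense of what the analog D-sub link has to carry at this monitor's native 1920 x 1200 @ 60 Hz, here is a rough pixel-clock estimate. This is only a sketch: the blanking figures below assume VESA CVT reduced-blanking timings, and the monitor's actual timings may differ slightly. At roughly 150 MHz, any cable loss or imperfect pixel-clock sampling on the analog side can visibly soften the image, whereas a digital link reproduces each pixel exactly.

```python
# Rough pixel-clock estimate for 1920x1200 @ 60 Hz.
# Blanking values are CVT reduced-blanking timings (an assumption;
# the monitor/card may use slightly different ones).
h_active, v_active, refresh_hz = 1920, 1200, 60
h_blank, v_blank = 160, 35            # assumed CVT-RB blanking intervals

h_total = h_active + h_blank          # total pixels per scanline, incl. blanking
v_total = v_active + v_blank          # total lines per frame, incl. blanking
pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6

print(f"Pixel clock: {pixel_clock_mhz:.1f} MHz")  # ~154.1 MHz
```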
 
Joined
Dec 7, 2007
Messages
2,214
Location
Puget Sound
Lightroom Experience
Intermediate
Lightroom Version
Classic
shoots said:
Overall I'm happy with the monitor, but when I view the same image file on my LG and then on my girlfriend's 23" Cinema Display, it just looks noticeably better on her monitor. It just seems "sharper" or more "crisp".

Maybe it's her S-IPS panel vs. my S-MVA that accounts for the difference, but it got me thinking that maybe my analog D-sub connection was a limiting factor...thus the cable switch, the subsequent flicker/blackout issue, and then this post... Thanks for any insights.

It's my opinion that the difference in panel technology would have more impact on the image than the difference between cables, not that cables don't have an impact on image quality. A good S-IPS panel is quite a treat to the eyes! Sorry I cannot be more specific, but I have only recently switched to a DVI cable, and it's on a new monitor and machine at work. For the record, I did not find the new setup radically different, but we're talking ViewSonic LCDs for office use.

--Ken
 

SiriusDoggy

New Member
Joined
Apr 3, 2008
Messages
16
Location
Las Vegas, NV, USA
Lightroom Experience
Intermediate
Lightroom Version
I have identical dual Samsung 912N monitors on my desktop computer. These monitors only have 15-pin D-sub connectors. The Radeon card has dual-head output: one is a 15-pin VGA D-sub and the other is DVI.
I have one monitor attached with a standard 6' VGA cable and the other attached with a 6' VGA to DVI cable.
I can very easily tell the difference between the two on text. The monitor that is hooked up to the DVI output is much sharper and cleaner.
Not sure why, since they are both analog signals by the time they reach the monitor.
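One possible explanation, sketched below: a passive VGA-to-DVI cable only works on a DVI-I port, which carries the analog VGA signal on dedicated pins alongside the digital one, so both monitors are indeed receiving analog. Any sharpness difference would then come from the card's two analog output paths (DAC and filtering quality can differ between heads), not from the cable type itself. The little lookup table is just an illustrative summary of the DVI connector variants:

```python
# Which signals each DVI connector variant carries (per the DVI spec:
# DVI-D is digital-only, DVI-A is analog-only, DVI-I carries both).
dvi_variants = {
    "DVI-D": {"digital": True,  "analog": False},
    "DVI-A": {"digital": False, "analog": True},
    "DVI-I": {"digital": True,  "analog": True},
}

# A passive VGA-to-DVI cable just picks up the analog pins, so it
# requires a DVI-I (or DVI-A) port; the GPU is still outputting VGA.
for name, signals in dvi_variants.items():
    kind = "analog + digital" if all(signals.values()) else \
           "analog only" if signals["analog"] else "digital only"
    print(f"{name}: {kind}")
```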
 