From: "Dolores Park" <noone@nowhere.n
To: All
Subject: Re: ADC->DVI worth doing?
Date: Sat, July 05, 2008 10:36 PM


On Sun, 6 Apr 2008 07:32:15 UTC, dempson@actrix.gen.nz (David Empson) wrote:

> There is a significant drop in video quality if I use the VGA input. DVI
> (or ADC via the DVI adapter) is much better quality. VGA is noticeably
> noisy and somewhat blurry. DVI is much sharper.

I have a 22-inch 16:10 Samsung that's connected to my MacBook via DVI and
to my PC via VGA. At first I was really unhappy with how bad the VGA
input looked. Then, with a page of text covering most of the screen,
I noticed that there were maybe 8 columns of alternating blurriness and
clarity. When I covered the screen with a bitmap of alternating black
and white pixel-sized dots, it became really obvious.
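In case it helps anyone, here's a minimal sketch of how such a test
pattern could be generated. It assumes Python with the Pillow library,
and the 1680x1050 size is just an assumption for a 22-inch 16:10 panel;
substitute your own display's native resolution.

from PIL import Image

WIDTH, HEIGHT = 1680, 1050   # assumed native resolution of a 22" 16:10 panel

img = Image.new("L", (WIDTH, HEIGHT))        # 8-bit grayscale, starts all black
px = img.load()
for y in range(HEIGHT):
    for x in range(WIDTH):
        if (x + y) % 2:                      # alternate every single pixel
            px[x, y] = 255                   # white dot

img.save("vga_test_pattern.png")

Display it full-screen at the panel's native resolution with no scaling,
over the VGA input; any pixel-clock or phase error shows up as the
vertical bands of blur described above.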

This monitor has some image adjustments that it labels "coarse" and
"fine". Using the coarse setting, going one way produced more & more
alternating columns; going the other way reduced it to two columns,
with the blurriness covering only the right-hand 5th of the screen.
Adjusting the fine setting "just so" eliminated all of the blurriness.

For my purposes, VGA now offers the same clarity as DVI. It appears
that the problem wasn't really with the monitor, per se, just with
the "auto-adjust" circuitry it uses to sync its electronics with the
VGA scan rate. If you have similar adjustments and a suitable test
pattern, you may want to try tweaking them a bit.

