DVI Question - Audio & Video Forums



john buckingham
03-19-2004, 06:17 PM
Will DVI give the best picture possible? Thanks in advance.

John

AVMASTER
03-20-2004, 04:11 AM
[QUOTE=john buckingham]Will DVI give the best picture possible?[/QUOTE]

To some degree, yes. It depends on the display device; from what I've actually seen so far, projected LCD/DLP and HD plasma displays seem to benefit from this connection the most.

Fallen Kell
03-20-2004, 08:02 AM
DVI will give you the best quality (well, for the most part). What makes DVI so good is that the signal stays digital and is never converted to analog until it is finally displayed on the screen (and even then the signal itself keeps its digital features).

Take the following setup: you want to watch a DVD, with the DVD player connected to your receiver's component inputs and the receiver connected to your TV by component as well. The signal will do the following:
The DVD player converts the signal to analog to send to the receiver. The receiver converts the analog signal back to digital to do its processing. The receiver then converts the digital signal back to analog to send to your TV. The TV converts the analog signal back to digital to display it to you. Each and every time the signal is converted, you potentially lose a lot of quality.

In a DVI-connected system, you will probably have the following setup:
DVD player connected to the receiver through either iLink or DVI, and the receiver connected to the TV by DVI. In this situation, the signal is NEVER CONVERTED!!! It stays in digital form throughout the entire process, so there is absolutely no data loss from conversions needing to approximate the signal's true value.
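
To illustrate the argument above, here is a minimal sketch (not from the original post; the converter bit depths are made-up assumptions) that models each analog hop as a re-quantization of the sample, while the DVI path passes the value through untouched:

[CODE]
# Toy model of the two paths described above. Illustrative only:
# real DAC/ADC stages also add noise and filtering, not just rounding.

def quantize(value, bits):
    """Round a sample in [0.0, 1.0] to the nearest of 2**bits levels."""
    levels = 2 ** bits - 1
    return round(value * levels) / levels

sample = 0.123456789  # a digital pixel value off the DVD, normalized

# Component path: player DAC -> receiver ADC -> receiver DAC -> TV ADC.
# Each stage re-approximates the value (hypothetical converter depths).
component = sample
for stage_bits in (12, 10, 11, 10):
    component = quantize(component, stage_bits)

# DVI path: the digital value is passed through unchanged.
dvi = sample

print(f"original:  {sample:.9f}")
print(f"component: {component:.9f} (error {abs(component - sample):.2e})")
print(f"DVI:       {dvi:.9f} (error 0)")
[/CODE]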

Worf101
03-20-2004, 08:23 AM
Sigh, yeah great, another reason to upgrade??!!! When is this latest "can't live without" technology supposed to break out?

woodman
03-20-2004, 06:46 PM
[QUOTE=Fallen Kell]DVI will give you the best quality (well, for the most part). What makes DVI so good is that the signal stays digital and is never converted to analog until it is finally displayed on the screen (and even then the signal itself keeps its digital features). ... In this situation, the signal is NEVER CONVERTED!!![/QUOTE]

Crikeys fella ... you've got a TON of misconceptions here, not the least of which is that DVI was created to give the consumer the best possible performance. Au contraire! DVI was created at the behest of the movie industry to try valiantly to salve their paranoia over the very idea that a consumer would have the audacity to make a copy of any of their precious product, even though the purpose may be perfectly benign.

Additionally, your scenarios about all of the back-and-forth conversions from digital to analog to digital to analog ... on and on, ad infinitum, are for the most part totally off base. Sorry, but such paranoid obsessions about potential signal degradation ring hollow to those of us in the profession who know better.

I'm curious to know just what you are referring to when you say "digital features"?

Fallen Kell
03-21-2004, 01:14 PM
Oh I agree that DVI is being used in the TV market to eventually lock in the signal and keep people from being able to record whatever they are watching on TV, but I beg to differ with you about the conversions not degrading the signal.

There is signal loss, but it depends on your source and your system. If the signal has to travel through even one set of DACs with fewer bits of resolution than the source, the signal will lose quality. And the signal does get converted back and forth through the system. Depending on the A/V receiver, it might just do a plain pass-through and not convert the signal, but it will still get converted once coming off your DVD player and once again inside your TV (this assumes you are using an HD or HD-ready TV; if you have an old analog set, this won't affect you).

For instance, if the original signal was produced using 32-bit digital cameras (Lucasfilm used these when filming the new Star Wars movies), the original source has 32 bits of resolution for color and brightness. If your DACs are only 24-bit, your reproduced image will be slightly off. Instead of your final image drawing from a possible 4,294,967,296 colors, it will only be able to show 16,777,216, which is a pretty big difference. This also assumes that your TV can actually represent that many colors on its screen; if it can't, then to your eye there will be no difference, but compared to the original source, there is one.
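
To put numbers on that (a quick check of the arithmetic above, nothing more):

[CODE]
# An n-bit value can take 2**n distinct levels, so going from 32-bit
# to 24-bit color shrinks the available palette by a factor of 256.
for bits in (24, 32):
    print(f"{bits}-bit: {2 ** bits:,} possible colors")

print(f"factor: {2 ** 32 // 2 ** 24}x")
[/CODE]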

"Digital features" was just a short way of saying whether or not the TV was capabile of using the digital signal. I should have just stated that instead, as it wasn't the best terminology.

The overall image will be determined by the weakest link in the chain that does any conversion from digital to analog or vice versa.
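
In code terms, that weakest-link rule might look like this (a hypothetical chain; the stage names and bit depths are made up for illustration):

[CODE]
# The effective resolution of a conversion chain is capped by its
# coarsest converter, no matter how good the other stages are.
chain = {"DVD player DAC": 24, "receiver ADC": 24, "receiver DAC": 20, "TV ADC": 24}

weakest = min(chain, key=chain.get)
print(f"weakest link: {weakest} at {chain[weakest]} bits")
print(f"effective palette: {2 ** chain[weakest]:,} colors")
[/CODE]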

Geoffcin
03-21-2004, 04:55 PM
[QUOTE=Worf101]Sigh, yeah great, another reason to upgrade??!!! When is this latest "can't live without" technology supposed to break out?[/QUOTE]

I agree! I wouldn't dump my gear for something with DVI just yet.

I've seen DVI vs. component and I couldn't see any difference. My guess is that for a long run, or for a cable that might need shielding from interference, DVI would be better, but it's just a guess.