Which provides better image quality YPbPr or DVI? [Archive] - Audio & Video Forums




m500
12-15-2005, 06:43 AM
Which provides better image quality, YPbPr or DVI? I just had my Moxi DVR installed with HD service, and the junk does not have the HDMI I was hoping for; instead it's got YPbPr and DVI. I'd like to use the best signal possible to justify my HD subscription. Thanks.

HAVIC
12-15-2005, 07:26 AM
DVI is better than HDMI, which is better than component video hookups. So you want DVI, not HDMI.

Sir Terrence the Terrible
12-15-2005, 08:09 AM
Actually, DVI and HDMI are pretty much the same when it comes to image quality. DVI only carries video, while HDMI carries video and multichannel audio.

edtyct
12-15-2005, 08:20 AM
m500 and Havic,

DVI is not superior to HDMI in video. At this point they are nearly identical in performance, because limitations in content and other factors fail to take advantage of HDMI's extra capability. However, any digital connection has the potential to outperform analog component in certain respects, depending on the source content and the target display. Generally, if the display is digital (not CRT), a digital connection can add a degree of sharpness that may or may not be visible to the naked eye. The assumption behind this recommendation, though, is that the signal between the two devices remains in the digital domain until the end. I usually assume that it does, certainly on higher-end equipment, but there is no guarantee. It is difficult to tell without getting the manufacturer to spill the beans, or simply checking via test screens and/or your eyes.

In any event, at this stage in our display evolution, using component video will not necessarily result in performance substantially below DVI and HDMI, and in some cases the smoothness of analog is preferable to poorly executed digital, which can degrade the moment it undergoes processing. The sad truth is that you have to check. Sometimes component will roll off the high frequencies a little bit. DVD players that upconvert digitally can add as much as ten lines of resolution to the picture; STBs might also better a digital TV's internal processing, though I wouldn't always bet on it.

The bottom line is that analog isn't necessarily a poor relation to digital. If you can afford a DVI cable, or a DVI-to-HDMI adaptor (no need to pay Monster prices for either), experimenting might be worthwhile to see whether digital trumps analog in your system.

Ed

HAVIC
12-15-2005, 09:22 AM
From what I have read, DVI has the ability to carry a 1080p signal, where HDMI is limited to 1080i. I believe this to be because of the bandwidth used to carry the audio in the HDMI cable. I guess technically at this point both HDMI and DVI would be equal, but when HD DVD players output an upconverted 1080p signal, DVI will be the only way to go.
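For what it's worth, the bandwidth question above can be sanity-checked with back-of-the-envelope arithmetic. The sketch below uses the standard 1080-line raster totals (2200 x 1125 including blanking) and compares the resulting pixel clock against the 165 MHz single-link TMDS ceiling shared by single-link DVI and early HDMI; the function name and constants are illustrative, not from any spec text quoted in this thread.

```python
# Rough TMDS pixel-clock estimate for the 1080i vs. 1080p bandwidth question.
# Raster totals (2200 x 1125) include horizontal/vertical blanking.

def pixel_clock_mhz(h_total, v_total, refresh_hz, interlaced=False):
    """Pixel clock in MHz for a given total raster and field/frame rate."""
    fields_per_frame = 2 if interlaced else 1  # interlaced sends half the lines per field
    return h_total * v_total * refresh_hz / fields_per_frame / 1e6

SINGLE_LINK_LIMIT_MHZ = 165  # single-link DVI / early HDMI TMDS limit

p60 = pixel_clock_mhz(2200, 1125, 60)                      # 1080p60
i60 = pixel_clock_mhz(2200, 1125, 60, interlaced=True)     # 1080i60 (60 fields/s)

print(f"1080p60 needs ~{p60:.2f} MHz")   # ~148.50 MHz
print(f"1080i60 needs ~{i60:.2f} MHz")   # ~74.25 MHz
print("1080p60 within single link:", p60 <= SINGLE_LINK_LIMIT_MHZ)
```

On these numbers, 1080p60 actually fits within a single 165 MHz link, which squares with Ed's follow-up below that HDMI may well pass 1080p after all.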

GMichael
12-15-2005, 09:31 AM
Not to jack your thread, but why do projectors (that have no speakers) have HDMI inputs? What's the point? Why not go with DVI?

edtyct
12-15-2005, 09:37 AM
Havic,

At one point, I, and a lot of other people, thought that HDMI as currently constituted would not be as good a fit for 1080p as DVI. I'm not so sure that HDMI won't pass it now, though some sort of signal boost or repeater will certainly be necessary for long runs.

GMichael,

Because HDMI does more than DVI. You can always use another audio connection with it. Plus, HDMI is better-equipped for the video future, especially with projectors that magnify contouring, banding, and granularity errors.

Ed

GMichael
12-15-2005, 10:06 AM

mmmmmmmmkay. Thanks.

jocko_nc
12-15-2005, 10:11 AM
I am using component video connection for HDTV. For 99% of the content that is out there, I don't think the type of cable connection is the limiting factor. In the future, it may be a bigger deal.

jocko

edtyct
12-15-2005, 10:51 AM
99%? Don't forget that many of us are compulsive--and that doesn't include the completely crazy people. We want to squeeze every possible drop of sharpness and color out of these boxes that we can. We will go to any lengths to grab a fraction of an inch in pixellated real estate. You appear to be one of the sane.

jocko_nc
12-15-2005, 11:03 AM
That's funny...

robert393
12-17-2005, 02:08 PM
m500 and Havic,

.........a digital connection can add a degree of sharpness that may or may not be visible to the naked eye.....using component video will not necessarily result in performance substantially below DVI and HDMI..............experimenting might be worthwhile to see whether digital trumps analog in your system.

Well said, Ed. I have component and DVI outputs on my Dish satellite HD receiver (DVR-942). In addition, I have the same two inputs on my projector. I have both connected at the same time. :D

I have A/B blind tested several times and I can't visibly tell a difference. I have opted to run video over DVI just because it is said to be the best, but like I said, I can't see the difference when projected from my JVC LCoS projector onto a 133" Draper screen. I would think with a screen that size, I could detect the slightest advantage of one over the other... but I can't! :confused:

Robert

edtyct
12-18-2005, 06:20 AM
It's a complex point that brings to bear a lot of issues relating to the accuracy of display devices. The largest possible shortcoming of a component feed (apart from copy protection) is that it is subject to noise that impacts the higher frequencies. It can look soft and indistinct compared to a digital signal. However, that's not the end of it. The display's component section can be executed well enough to minimize any loss of detail, and the type of source, viewing distance, and environment (lighting, screen gain, etc.) can minimize it further.
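The high-frequency rolloff Ed describes can be pictured with a toy first-order (RC) low-pass model of a marginal analog path. The cutoff values below are purely illustrative, not measurements of any real cable or component stage; the point is only that fine 1080-line detail, which lives at tens of MHz, gets attenuated when the analog chain's bandwidth is marginal.

```python
import math

# Toy RC low-pass model of analog cable/driver rolloff.
# Attenuation in dB at a given frequency for a given -3 dB cutoff.

def rolloff_db(freq_mhz, cutoff_mhz):
    gain = 1.0 / math.sqrt(1.0 + (freq_mhz / cutoff_mhz) ** 2)
    return 20.0 * math.log10(gain)

# Fine HD luma detail sits up around 30 MHz. A marginal 30 MHz analog path
# softens it noticeably; a generous 300 MHz path barely touches it.
print(f"marginal path: {rolloff_db(30, 30):.2f} dB")   # -3 dB right at the cutoff
print(f"generous path: {rolloff_db(30, 300):.2f} dB")
```

A digital link either delivers those bits intact or fails visibly, which is why the softness Ed mentions is specifically an analog-path risk rather than a cable-quality certainty.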

Digital signals are notoriously affected by processing. Anything done to a digital signal to make it brighter, sharper, more intense, etc., within the digital domain will degrade it, effectively lowering the bit depth at which it started. DVI begins at a theoretical 8-bit depth, which is pretty much standard throughout the content-providing industry at this point. That depth can allow greyscale anomalies to become visible on current displays with typical contrast ratios, particularly at the lower end in a dark room.

But it gets worse. For a display to hold the line at 8 bits requires that its own processing be elaborate (and expensive). DVD players and displays capable of their own 10-bit processing can help keep the 8 bits of the source (or the DVI limit, whichever is relevant) relatively intact, thereby mollifying any tendency toward false contouring. It's important to realize, however, that 10-bit processing will not eliminate false contouring on a typical display either; it can reduce it but not necessarily overcome it completely.

Furthermore, if artifacts are present on screen, it will not always be possible to identify their primary origins without serious testing. False contouring could be embedded in source material, in a compression scheme, or in a display's native technology. For example, PWM, an ingenious method of creating gradations in grey by varying the length of time that particular pixels on a DLP catch light, can cause contouring due to interruptions in signal continuity.
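The loss of effective bit depth from processing can be shown with a toy sketch (this is not a real display pipeline, just an illustration of the rounding principle): darken an 8-bit grey ramp by half, then double it back. A pipeline that rounds to 8 bits after every operation throws away levels for good, while one that keeps full internal precision and rounds once at the end preserves them.

```python
# Toy illustration of why processing at the source bit depth causes banding.

ramp = list(range(256))  # an 8-bit greyscale ramp, 0..255

# Low-precision pipeline: round to 8 bits after EVERY operation.
# Halving destroys the odd/even distinction; doubling can't bring it back.
low = [min(255, round(round(v * 0.5) * 2.0)) for v in ramp]

# High-precision pipeline: keep full precision internally, round once at the end.
high = [min(255, round(v * 0.5 * 2.0)) for v in ramp]

print("distinct levels, 8-bit-per-step pipeline:", len(set(low)))   # roughly half survive
print("distinct levels, high-precision pipeline:", len(set(high)))  # all 256 survive
```

The surviving levels in the low-precision case cluster into coarse steps, which is exactly the false contouring Ed describes; a 10-bit (or wider) internal path plays the role of the `high` pipeline here.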

By the time display devices find their way into our homes, much of the electronics et al. responsible for how well they can operate lies hidden. It may help to get published specs, but the wiggle room for level of actual performance is great. Relying on eyes, our own and/or someone else's, is unavoidable. But then that presents another set of problems. The good news is that, even with different technologies' various strengths and weaknesses, hi def looks awfully good on all of them, and the degree of difference in their abilities to reproduce other kinds of content within certain price points isn't so great that we can't enjoy what we have.

Ed

westcott
12-19-2005, 09:57 AM
edtyct,

Where did I hear this before? (JK)

I am glad that you are helping demystify and educate those who assume DVI and HDMI are the same.

Happy Holidays!

edtyct
12-19-2005, 10:24 AM
Likewise, westcott, and to the rest of the delinquents here. By the way, I don't know whether you were around for this discussion, but I managed to get my hands on Datacolor's Spyder TV colorimeter. I hope to give it a whirl sometime in the next two or three weeks, comparing it to calibration via DVE. (It wouldn't be cricket to compare it with ISF calibration, since it can't go that deep, but for those who have paid for an ISF adjustment, the Spyder may help to minimize any drift. We'll see.) I've heard of people adapting it to their front projection systems. I'll have to learn a little more about the hows and wherefores, but it might be a handy tool beyond its stated boundary.

Ed