Component (Analog) vs Digital Connections [Archive] - Audio & Video Forums

smgemelos
12-02-2005, 03:05 PM
I am trying to figure out if there is a benefit to going digital versus analog to drive an HDTV. I read an article from Blue Jeans Cable, and they made arguments for and against each type.... component and HDMI/DVI....

Being an electrical engineer, and having worked on high-bitrate board design and transmission systems, I have some knowledge of analog and digital transmission and the issues related to cabling and its impact on signal quality....

but I still have some opinions and some questions, and I would like to see how it jibes with you all...

We are starting with a digital format - assuming we are using a DVD player or HD receiver as the source. For short cable runs the cable noise is minimal (assuming a good quality cable); most of the noise and distortion is introduced each time the signal is converted from digital to analog (D-A) or analog to digital (A-D)... either electronic noise in the D-A converter, or quantization noise in the A-D converter. So we want to minimize the number of times we make these conversions.
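To put a number on the quantization-noise point: an ideal N-bit converter gives roughly 6.02*N + 1.76 dB of signal-to-noise ratio on a full-scale signal. A quick Python sketch (illustrative only; the sine wave is just a stand-in for an analog video signal) that quantizes a signal and measures the resulting SNR:

    import numpy as np

    def quantize(signal, bits):
        """Uniformly quantize a [-1, 1] signal to 2**bits levels (ideal A-D)."""
        step = 2.0 / (2 ** bits)
        return np.round(signal / step) * step

    # Full-scale sine wave standing in for an analog video signal.
    t = np.linspace(0, 1, 100000, endpoint=False)
    x = np.sin(2 * np.pi * 1000 * t)

    for bits in (8, 10, 12):
        noise = quantize(x, bits) - x  # quantization error
        snr_db = 10 * np.log10(np.mean(x ** 2) / np.mean(noise ** 2))
        # Theory for an ideal N-bit quantizer: 6.02*N + 1.76 dB
        print(f"{bits}-bit: measured {snr_db:.1f} dB, theory {6.02 * bits + 1.76:.1f} dB")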

So if we have a digital display - plasma, LCD, or DLP - we should avoid analog completely and avoid adding any additional noise to the signal... what comes out of the MPEG decoder goes straight to the display. It would be foolish to use component cables for a digital display - you are using the D-A converters in the source to create an analog signal, and then the A-D converters in the display to present the image.

If we have an analog display - any tube-based display - we should minimize the number of D-A and A-D conversions... and there should be only one, converting from digital to analog for presentation on the display. And we should choose the best D-A converter in the system. If the source has the better D-A converter, use that converter and drive the display with component video cables. If the display has the better D-A converters, use HDMI or DVI cables from the source to the display.
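As a toy illustration of why fewer conversions win, here is a sketch that combines per-stage SNRs for the competing chains. The dB figures are made-up stand-ins, not measured specs:

    import math

    # Toy model: each D-A or A-D hop adds independent noise.
    # These SNR figures (dB) are invented for illustration only.
    NOISE_DB = {"source_dac": 60.0, "display_adc": 55.0, "display_dac": 58.0}

    def chain_snr(hops):
        """Combine per-hop SNRs (dB), assuming independent additive noise."""
        noise_power = sum(10 ** (-NOISE_DB[h] / 10) for h in hops)
        return -10 * math.log10(noise_power)

    # Digital display fed component video: source D-A, then display A-D.
    print("component into digital display:", round(chain_snr(["source_dac", "display_adc"]), 1), "dB")
    # Analog display fed HDMI/DVI: a single D-A in the display.
    print("HDMI into analog display:", round(chain_snr(["display_dac"]), 1), "dB")
    # A digital display fed HDMI/DVI needs no conversion at all.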


This is my understanding of all this from an engineer's point of view.... What's your take on this?

edtyct
12-02-2005, 03:33 PM
You got it. It's basically the D/A and A/D conversions that soften and create some degree of noise on digital displays. If the HDMI/DVI transmission is well executed, theoretically a digital display will have a crisper, sharper look. However, in the real world, the difference isn't always evident, since a bunch of variables affect image quality--deinterlacing and scaling being two important ones, as well as the execution of the HDMI/DVI source and sink. On some digital displays, the digital connection can look softer than component for some reason at 480p.

Technically, I agree with your last paragraph about analog displays. But I'm tempted to go one step further. D/A conversion anywhere along the chain is the culprit, not where it occurs, though I'd be more inclined to say that DVD players are better than TVs at it and that STBs are a tossup. If I had a component-only TV, I'd worry about the lack of HDMI/DVI for copy-protection reasons, not so much for a loss of image quality.

Ed

Woochifer
12-02-2005, 03:40 PM
I think it all boils down to digital video connections already becoming the standard for HD. All new HDTVs and almost all new DVD players now include HDMI connections. Pressure from the studios has already resulted in the upcoming HD-DVD and Blu-ray formats requiring a secure digital video connection and limiting the resolution that can be displayed over analog component video connections. They are now pushing for similar restrictions on all future HD hardware.

The technical arguments about the merits of analog versus digital video are almost meaningless when the studios are primarily concerned with copy protection and piracy. In the next few years, I think you'll see a rapid phase-out of analog video connections and more restrictions on analog resolution in new hardware. If you decide to go with analog component video, the time to optimize the video setup might be now, because future HD sources may be restricted to digital video outputs, with analog component video limited to 480p.

westcott
12-06-2005, 10:27 PM
I agree with Wooch. If you want to future proof your system, HDMI is probably the best choice.

HDMI can handle greater color depth, on the order of 12-bit RGB.

A good DVD player with the right chip can put out a 10-bit RGB signal via HDMI.

DVI is limited to 8-bit RGB.

So, if anyone tells you DVI and HDMI are the same, keep moving.

By the way, higher bit depths help eliminate the contouring artifacts prominent on digital displays. A 10-bit RGB signal is enough to put this issue behind you.
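For scale, here is a quick sketch of what those bit depths buy you per color channel - the coarser the step, the more visible the banding on smooth gradients:

    # Levels per color channel at each bit depth, and the step size as a
    # percentage of full scale. Coarse steps show up as contouring.
    for bits in (8, 10, 12):
        levels = 2 ** bits
        print(f"{bits}-bit: {levels} levels per channel, {100.0 / (levels - 1):.3f}% per step")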

evil__betty
12-06-2005, 10:38 PM
Since we're on this topic, I'll re-post a question I had in a different thread:

If HDMI and DVI are strictly digital signals (00110100111101101), is there any benefit to going with a more expensive cable? I fully believe that cables make a world of difference and that you need good shielding on any analog signal - but digital? Seems kinda like a gold-plated USB cable - really no use at all.

Also, what is the signal drop-off length for an HDMI cable?

edtyct
12-07-2005, 05:25 AM
More expensive per se won't mean unconditionally better, even if better in a certain context. Length is really the major culprit. DVI supposedly taps out at about 15 ft. HDMI was intended to surpass that figure, though it seems not to hold up much beyond it. Digital signals don't so much gradually decline as die all of a sudden. Long lengths generally benefit from a signal booster. Fiber optic cables are also an option, but they are wildly expensive and not necessarily robust. The more expensive conventional cables may have protection against bending, although sometimes such protection can have adverse effects on uniform impedance, or so I've heard. I did an informal viewing comparison between expensive DVI/DVI-HDMI cables and cables costing about 1/10 the price at about 6 ft. on a 42" digital display (I didn't cut the cables to look at the insides). No difference by eye. Longer lengths in various situations would be necessary for a better sense.
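The die-all-of-a-sudden behavior is the usual digital cliff: the receiver slices the waveform against a threshold, so errors are negligible until cable loss eats the noise margin, and then the bit-error rate explodes. A rough sketch of that threshold behavior, assuming idealized binary signaling (the dB values are invented for illustration):

    import math

    def ber(snr_db):
        """Bit-error rate for idealized binary signaling: 0.5 * erfc(sqrt(SNR))."""
        return 0.5 * math.erfc(math.sqrt(10 ** (snr_db / 10)))

    # A few dB of extra cable loss takes the link from "never see an
    # error" to "unwatchable" almost immediately -- the cliff.
    for snr_db in (16, 14, 12, 10, 8, 6):
        print(f"SNR {snr_db:2d} dB -> BER {ber(snr_db):.2e}")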

edtyct
12-07-2005, 05:42 AM
HDMI handles YCbCr 4:2:2 formats at up to 12 bits, not necessarily at 12 bits. False contouring used to be a major factor in plasmas and other digital displays that couldn't handle the details of gray scale and color depth. A lot of the early problems have dissipated, though the larger displays will show flaws more conspicuously. For the vast majority of cases DVI video and HDMI video will overlap fairly safely.

One of the problems that sometimes plagued (exaggerated term, really) DVI concerned DVD upconversions that used an HD color space (Rec. 709) rather than a standard-definition color space (Rec. 601), but you might not notice if no one pointed it out to you.
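To see why decoding with the wrong matrix shifts colors, compare the luma coefficients: Rec. 601 uses Kr = 0.299, Kb = 0.114, while Rec. 709 uses Kr = 0.2126, Kb = 0.0722. Here is a minimal sketch (full-range math for simplicity) that encodes a pure red in Rec. 709 and then decodes it with each set of coefficients:

    def ycbcr_to_rgb(y, cb, cr, kr, kb):
        """Invert the YCbCr encode for luma coefficients (kr, kb), full range."""
        r = y + 2 * (1 - kr) * cr
        b = y + 2 * (1 - kb) * cb
        g = (y - kr * r - kb * b) / (1 - kr - kb)
        return r, g, b

    # Encode a saturated red in Rec. 709...
    kr709, kb709 = 0.2126, 0.0722
    r, g, b = 1.0, 0.0, 0.0
    y = kr709 * r + (1 - kr709 - kb709) * g + kb709 * b
    cb = (b - y) / (2 * (1 - kb709))
    cr = (r - y) / (2 * (1 - kr709))

    # ...then decode with the right and the wrong coefficients.
    print("decoded as 709:", [round(v, 3) for v in ycbcr_to_rgb(y, cb, cr, kr709, kb709)])
    print("decoded as 601:", [round(v, 3) for v in ycbcr_to_rgb(y, cb, cr, 0.299, 0.114)])

The red comes back dimmer and slightly off-hue when the 601 matrix is applied to 709-encoded video.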