You got it. It's basically the D/A and A/D conversions that soften the picture and add some degree of noise on digital displays. If the HDMI/DVI transmission is well executed, a digital display should theoretically look crisper and sharper. In the real world, though, the difference isn't always evident, since a bunch of variables affect image quality, deinterlacing and scaling being two important ones, along with the execution of the HDMI/DVI source and sink. And on some digital displays, for whatever reason, the digital connection can actually look softer than component at 480p.
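
For what it's worth, here's a toy sketch (Python with NumPy, all the numbers made up for illustration) of why an extra D/A -> A/D round trip softens an edge and adds noise, while a bit-for-bit digital path leaves the signal alone:

    import numpy as np

    rng = np.random.default_rng(0)

    # One scan line of "video": a sharp black-to-white edge.
    line = np.zeros(64)
    line[32:] = 235.0  # video white level

    # Digital path (HDMI/DVI): bits in, bits out, untouched.
    digital_out = line.copy()

    # Analog path: the DAC's reconstruction filter softens the edge,
    # the cable adds a little noise, and the display's A/D re-quantizes.
    kernel = np.array([0.25, 0.5, 0.25])   # mild low-pass standing in for the DAC filter
    padded = np.pad(line, 1, mode="edge")
    analog = np.convolve(padded, kernel, mode="valid")
    analog += rng.normal(0.0, 1.0, size=analog.shape)  # cable/termination noise
    analog_out = np.clip(np.round(analog), 0, 255)     # display A/D re-quantizes

    print("max error, digital path:", np.abs(digital_out - line).max())
    print("max error, analog path: ", np.abs(analog_out - line).max())

The digital path comes back with zero error; the analog path shows its biggest error right at the edge, which is exactly the softening you see on screen.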

Technically, I agree with your last paragraph about analog displays. But I'm tempted to go one step further: the culprit is D/A conversion anywhere along the chain, not where it occurs, though I'd be more inclined to say that DVD players handle it better than TVs and that STBs are a tossup. If I had a component-only TV, I'd worry about the lack of HDMI/DVI for copy-protection reasons, not so much for any loss of image quality.

Ed