DVI-I is a connector that accepts both DVI-A (analog) and DVI-D (digital), which are not compatible with each other. DVI-A is largely a computer format. DVI-D is the consumer variety of DVI, the one most people consider to be the same as HDMI without the audio (slightly inaccurately); it has also found its way into the computer world between certain video cards and displays. DVI w/HDCP is DVI-D that carries the digital copy protection most commonly found on HDMI; at the dawn of DVI on consumer-video products, HDCP wasn't always included, which sometimes made early gear incompatible with later products that demanded it. DVI, component, and HDMI are all capable of resolutions higher than 720p.
Edit: I went to private school. I edited for grammar. It's a curse.
Okay, and thanks for the comments. I read them this way: DVI-I (or DVI-D) is the way to go for home theater, and so long as I don't mind audio not being included (of course not, this is a connection to my screen, which has no audio), there is no reason to have HDMI, as it does not add anything. Worst case, an adapter might be available to convert the DVI pins to HDMI pins (with no audio).
Easy tiger! If you are planning on buying anything for your home theatre, make sure that it is fully compatible with HDMI. DVI was a current connection on HT devices for about a year. That's it. It is still used on some hardware that hasn't been updated yet (like HD cable and satellite boxes), but everything from now on will incorporate HDMI as the primary digital connection. HD DVD and Blu-ray will use only HDMI as an output for HD signals, and yes, an adapter will work to bridge HDMI to DVI, but HDMI will be the standard. While it is still in its infancy now, HDMI will mature and be able to handle in one cable what can now take up to 11 individual cables (replacing the analog connections for 8 audio channels - SACD - plus component video).
Aww, he's not so evil. A few cautious higher-end companies, happily out of the mainstream, still favor DVI for the simple reason that it isn't a moving target like HDMI, which is still evolving, not yet ready for everything envisioned for it, and sometimes fails to do even what it purports to do now. But, leaving aside the matter of audio, HDMI is slightly more versatile than DVI in the realm of component-video processing, though not critically so. It also supports up to 12-bit processing of signals, whereas DVI taps out at 8 bits; that said, 12 bits rarely make it to the consumer world, and 10 bits are no guarantee of noticeable superiority. If you find yourself in a DVI context, DVI-D is all that matters for you. DVI-I is something of an anomaly.