Quote:
Originally posted by Kevin C Brown:
[b]>> "DVI is only 8 bit, but HDMI can go up to 12 bits."

> Both are 8-bit for RGB. YCbCr on HDMI can be 8, 10, or 12-bit.

From the Secrets article:
Quote:
If however, you have an HDMI source and a DVI display, the below-black video information may be lost in the translation.
Quote:
Also, DVD data are YCbCr (not RGB), and are converted to RGB in the player for the DVI output. RGB cannot represent all the data in YCbCr, and this is why the below-black information gets truncated.
This also means that if you have an HDMI player and an HDMI display, and you converted to DVI to use the switching in the 990 and then converted back to HDMI for the display, you would also lose the below-black information.

Quote:
At CEDIA 2004, new DVD players and projectors had HDMI, but no DVI, which means DVI is just about gone after only one year on the market.
CEDIA 2004... [/b]
This is great information, but I'm getting lost in it.
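
Let me try to restate the below-black part to check that I follow it. As I understand it, studio-range YCbCr keeps some codes below nominal black (Y = 16), but full-range RGB has nothing below 0, so the conversion has to clip them. Here's a rough sketch of that, my own illustration, assuming the standard BT.601 limited-range conversion in Python (not necessarily what any particular player actually does):

[code]
# Rough sketch: why below-black data can vanish in a
# studio-range YCbCr -> full-range RGB conversion.
# Assumes standard BT.601 limited-range coefficients.

def ycbcr_to_rgb_bt601(y, cb, cr):
    """Convert 8-bit studio-range YCbCr (Y: 16-235) to 8-bit full-range RGB."""
    # Remove the studio-range offsets.
    y_adj = y - 16
    cb_adj = cb - 128
    cr_adj = cr - 128

    # BT.601 conversion coefficients.
    r = 1.164 * y_adj + 1.596 * cr_adj
    g = 1.164 * y_adj - 0.392 * cb_adj - 0.813 * cr_adj
    b = 1.164 * y_adj + 2.017 * cb_adj

    # Full-range RGB has no headroom below 0, so anything darker than
    # nominal black (Y = 16) gets clipped here.
    clip = lambda v: max(0, min(255, round(v)))
    return clip(r), clip(g), clip(b)

# Nominal black and a below-black (PLUGE-style) level both come out
# as RGB (0, 0, 0): the below-black detail is gone.
print(ycbcr_to_rgb_bt601(16, 128, 128))  # (0, 0, 0)
print(ycbcr_to_rgb_bt601(10, 128, 128))  # (0, 0, 0)
[/code]

If that's right, then both Y = 16 and a below-black level like Y = 10 land on the same RGB value, which would be the truncation the article describes. Someone correct me if I'm off.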

Given that audio standards are not resolved for HDMI, I'll omit any questions on that subject.

If I had a (video-only) DVI source and a DVI display, are you saying it would still be (potentially) inferior to an HDMI source and display?

And are you saying that DVD data (until we see HD DVD or Blu-ray?) is converted to analog before the DVI output, so it's no "better" than the regular RCA-connected RGB outs from a DVD player?

Where does the old computer VGA connector fall in this hierarchy? That is, my Infocus 5700 has a VGA input and a DVI input. If I had a computer playing a DVD, would the data also be converted to analog before leaving the computer video card?

If there is no video improvement available from DVDs beyond what analog provides, due to the type of data, does it make any difference which connection you use (RGB via RCA, VGA, DVI, HDMI)?

Thanks...