Is it obsolete? I'll say
NO! (Then I'll explain why I'm yelling...)
I recently updated my
HDMI FAQ for HDMI v1.4, shortly before the June publication date was announced. Personally, I find v1.4 more insulting than anything else. Support for 3D and for 4K resolutions is nice, but taking advantage of either would mean replacing my display and my source components before it could justify replacing my surround processor. It would also help if actual consumer-marketplace standards existed for those two features; without them, there's no display or source I could buy that would make the inclusion of v1.4 worthwhile. I'd really rather they wait to revise the HDMI spec until an industry-adopted consumer 3D spec exists, since past experience with HDMI tells us they'll have to revise it again later to get it right (see HDMI v1.1, v1.2, and v1.3 and the lossless audio formats). I've seen some reports suggesting that we may see the first consumer-oriented 3D standard published within the next year or so, which means we could see hardware that supports it as soon as mid- to late 2011 (with an "early adopter" price tag). The extra bandwidth for 4K is largely a side-effect of planning for 3D, I think, but again we are even farther away from sources offering that sort of resolution in the first place. Displays will likely get there first, but they'll require video scaling of lower-resolution (1080p and below) sources - which existing HDMI versions can already handle just fine - and they'll be rare and very expensive for a good while to come. The rest of the update is (at least in my opinion) pure screw-the-consumer marketing maneuvering.
First, they've revised the "small" HDMI connector they created in v1.3, before anyone's really had a chance to use the first one. Not only does that hurt manufacturers who might have had a use for it, it penalizes any consumers who might have tried to buy into the v1.3 connector. What was wrong with the first one that it had to be redesigned immediately? Why not wait and get it right the first time? And if you're going to redesign connectors, why not provide a locking solution for the standard HDMI connector? It has been hands-down the most poorly designed connector in the marketplace for almost a decade now; that would seem like a better place to focus attention.
Also, integrating Ethernet into HDMI? Why? To truly phase out network cables, we would need to replace every network-enabled device with v1.4 components, and we would still need to connect one of those devices to our network somehow. I'm also unclear on how the network will work, as the press releases to date have not made any effort to provide details. Ethernet is not a daisy-chain architecture; it's a star topology. That means that to enable Ethernet across a group of HDMI devices, we either need to sacrifice some of the benefits of the star topology (speed and simpler troubleshooting) for something similar to a daisy chain (anybody out there remember 10BASE2 networks?) or designate a "hub" HDMI component that incorporates a network switch (increasing the cost of that component and requiring the consumer to correctly identify it, which could easily be either a TV or a surround receiver depending on the application). All that complexity saves us what, exactly? A couple of Ethernet cables? We replace all that expensive gear to get rid of two or three inexpensive cables (as long as you don't buy from Denon) that have reliable locking connectors? We still have to set up networking on each component, and we still need some way to get a network connection to the equipment rack. This doesn't make life easier for consumers - it just creates confusion by "blurring" the wiring when they try to figure out how to make Ethernet work in their system. And manufacturers don't save anything: they still need to provide an Ethernet interface to use this capability, and they still need to provide an Ethernet jack to handle the inevitable situation of a consumer not having a fully v1.4 system. The only winner I see here is HDMI's licensing income.
The automotive bit is obviously unrelated to home theater. I wonder if it couldn't have been handled using the existing v1.3 small connector or a new automotive connector as a "v1.3c" revision.
I've read one source suggesting that the only reason they added Ethernet was to differentiate their feature set from DisplayPort. The same source suggested that the automotive aspect of v1.4 was aimed at getting into that sector before IEEE-1394 (a.k.a. FireWire or i.LINK) does. Those are the only features in the revised spec that are truly new and usable without waiting for future video formats to be developed and adopted as standards. If both really were driven by a desire to compete with other standards rather than to provide an actual benefit to the user, I'm not interested in pushing for the standard's adoption in the marketplace. Heck, I could still be quite happy with HDMI v1.1 or v1.2.