At its heart, the issue really comes down to compression. You can slice the definition of compression any number of ways. As a Chicago native, I'm all about the slice, although it's a dramatically different experience than a slice is for someone who grew up in New York. For me, gimme a couple of slices of sausage and a pitcher of beer from Pizano's on State. For Yogi Berra, a native of St. Louis' Hill who was more concerned about quantity than quality, it was "How many slices should you cut it into? Six. I could never eat eight." For New Yorkers, a slice is just that. You walk in and order a slice, you get a slice--a wedge of cheese pizza on a paper plate, with one option--in a white paper bag (go) or not (stay). You want other stuff on the slice, you call it out up front. A slice is one thing. Anything else is anything else--a slice with stuff, if you will.
Similarly, at some point, the compression argument moves beyond layers 8 and 9 (politics and religion) onto layer 10. Which, unfortunately, is undefined...you can't even call it the stuff layer. If you're familiar with the difference between contradiction and argument (and if you're reading this, you should be, you silly git--Google it if that didn't make sense), you understand that what some people call compression is what others call, well, not so much.
Today, I was wearing orange socks with ducks on them. Shut up. They might've been mallards. They matched my shirt. Shut up! I have a point.
At some juncture, if something walks like a duck, quacks like a duck, and performs other bodily functions like a duck, it's a duck. Therein lies my personal quandary, and likely that of the consumer electronics industry as a whole. Are approaches which "prioritize and throw away less significant video bits in the event of wireless channel issues" any better or worse than solutions which provide "visually lossless" video compression? At its heart, why is there this STUPID discussion about compressed versus uncompressed?
The issue should be, at retail, what will sell in, what will sell through, and what will stay sold. Instead, we're in this silly jihad about "compression" (in quotes on purpose) rather than what the heck the consumer needs, and whether there's an opportunity for more than one approach.
Work with me as I climb on my soapbox. If you believe that solutions using compression are the right way to go, particularly in light of the advancing signal bandwidth of HDMI from 4.95 Gb/s to 10.2 Gb/s and beyond, you'll believe that approaches like Tzero's or PulseLink's will be the perfect method. In this case, you'll argue about the merits of a solution which co-exists with the WiMedia standard (Tzero) versus one which doesn't, but has lots more available bandwidth, meaning less need for compression and a potentially better quality video experience (PulseLink). And, you'll argue against solutions like those from Amimon, Radiospire, and Sibeam.
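To put some numbers on that bandwidth argument (my own back-of-envelope sketch, not any vendor's published math): uncompressed 1080p at 24-bit color needs roughly 3 Gb/s of raw payload at 60 frames per second, which is exactly why the jump from 4.95 Gb/s to 10.2 Gb/s matters so much to the no-compression camp.

```python
def raw_bitrate_gbps(width, height, bits_per_pixel, fps):
    """Uncompressed video payload rate in gigabits per second.

    Back-of-envelope only: ignores blanking intervals, audio,
    and link-layer encoding overhead.
    """
    return width * height * bits_per_pixel * fps / 1e9

# 1080p/24 (film-rate) vs. 1080p/60, both at 24-bit color
film = raw_bitrate_gbps(1920, 1080, 24, 24)   # ~1.19 Gb/s
video = raw_bitrate_gbps(1920, 1080, 24, 60)  # ~2.99 Gb/s
```

The point: 1080p/24 fits comfortably inside today's wireless links, while 1080p/60 pushes right up against them--hence the argument.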
Conversely, if you believe that compressing the video signal adds unwanted shmutz to the video (like putting mushrooms on my slice), you'll be a fan of Amimon, Radiospire, and Sibeam. I've seen demos from every one of the five folks I mention here, and while they're all really promising, the winner may end up being F, none of the above.
If you're spec'ing a solution which needs less than a millisecond of latency, you'll go with Amimon, Radiospire, or Sibeam. If you want the best quality video today, you'll go with Tzero or PulseLink, whose solutions are (to my admittedly non-golden but still reasonably well-trained eyes) the best options available for 1080p/24, but which don't do as well beyond that, since the compression technology is lagging a bit behind the wireless technology. Today, for 1080p/60, if you want something that's going to provide extremely good quality in an in-room fashion at the lowest latency, you'll choose Radiospire, hands down. If you want something that's going to provide good (not great, at least at this point) video quality with the lowest latency with far and away the best range and resilience, you'll choose Amimon. And, if you're content to wait two years while the Wireless HD consortium takes their 60 GHz millimeter wave technology (which looks great and is very low latency, but is the size of a Kegulator) and shrinks it into something that can exist in your house without requiring its own dedicated 20 amp circuit, you'll wait for them.
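For a sense of how tight a sub-millisecond latency budget really is (again, back-of-envelope arithmetic of my own, not any vendor's spec): a single frame at 60 fps takes about 16.7 ms, so any codec that buffers even one full frame has already blown a one-millisecond budget sixteen times over. One millisecond buys you only about 65 scan lines of a 1080-line frame.

```python
def frame_period_ms(fps):
    """Duration of one video frame, in milliseconds."""
    return 1000.0 / fps

def lines_per_ms(active_lines, fps):
    """How many scan lines are delivered per millisecond.

    Simplified: assumes lines arrive evenly across the frame period.
    """
    return active_lines * fps / 1000.0

period = frame_period_ms(60)        # ~16.7 ms per frame
budget = lines_per_ms(1080, 60)     # ~64.8 lines per millisecond
```

So a sub-millisecond system can buffer at most a handful of lines before transmitting--which is why the low-latency camp treats frame-buffering compression as a non-starter.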
Ultimately, who's your winner? Well, nobody yet, because winners require products in consumers' homes, and we don't have those yet.
By the way...when we're talking about 1080p/60, bear in mind that there's, uh, precious little content mastered/delivered at 1080p/60 today (particularly over any form of broadcast, be it cable, satellite, or over-the-air), so a lot of this discussion is like figuring out if those are ducks or mallards on my socks. Is it relevant? Today, no, it's not. But my feet are comfy.
So, what's important here? I can do a fancy schmancy SWOT matrix of who wins what, where, why, and how. But, it really comes back to who--who's going to get a product into stores that consumers are going to buy, and be happy enough with performance that 95+% of the products stay sold and don't end up in the reverse supply chain, screwing up margins and numbers for everyone. Only eBay wins there, thanks to their mastery and democratization of the reverse supply chain, but Joe Six-Pack sure doesn't.
Once a number of products have shipped, only then can we predict a winner. And, keeping in mind that different vendors and solutions serve different demographics (e.g., Best Buy vs. Radio Shack), you may see more than one winner.
While we sit here and argue about the merits and shortcomings of "compression", I still can't buy a solution to wirelessly deliver my HDCP-protected HD sources to my HD display, any more than I can figure out whether these socks actually work well with this shirt or not. But at least I'm willing to take the risk and do so.
I challenge the consumer electronics industry to do the same.