Wednesday, December 31, 2008

Coop's 3 Tech Predictions for 2009
#3--Still No Wireless High Definition TV

Roughly 103 weeks ago, I was quoted in the Chicago Tribune saying that 2007 would be the year of the wireless HDMI dongle, an external adapter allowing consumers to connect their video sources to their displays sans wires. I further predicted that 2008 would see such technology embedded inside televisions.

Uh, whoops.

I took a page out of Henny Youngman's book ("When I read about the evils of drinking, I gave up reading") by eschewing predictions during CES 2008. But, with apologies to Keith Olbermann, you can't stop me, you can only hope to contain me. Thus, I'm back with a few predictions for the year in tech, 2009. The good news is, I can't possibly do any worse than these folks did in 2008.

Let's start with a topic near and dear to my heart, the wireless transmission of high definition television. For the purposes of this post, we'll refer to the concept as "wireless high-def" so as not to infringe upon or provide undue props to any particular company, organization, or alliance. To further clarify, we're talking about delivering high-def consumer content from a source to a sink, such as a digital set-top box to a flat panel TV; we're not talking about shipping video around a newsroom, delivering video over the Internet, or doing live remotes. Think consumer homes, living rooms, and apple pie.

Five score and three weeks ago, we thought we had it nailed at my former company, Tzero Technologies. We believed that we'd created the first viable solution for cutting the cord on the set-top box, enabling the creation of an entire class of devices and accessories freed from the tether of the audio/video cable. Two years on, I still believe that what we'd created then was absolutely revolutionary.

Unfortunately, revolutionary doesn't always translate into sales success. I'm not picking on Tzero, nor am I picking on Pulse~LINK, Amimon, Radiospire, SiBEAM (all of whose technologies I touched on during last year's CES) or any of the other players trying to make a go of it in this field; they (and others) have their heads down, trying to figure out how to deliver solutions that the market wants.

If this were easy, my 2007 predictions would've held true.

But it isn't easy. Wi-Fi wasn't easy in the early days, when I spent a couple hundred bucks for an access point and another hundred bucks for an 802.11b card, only to learn that $300+ bought me a solution dramatically less reliable and much slower than the 100-foot Ethernet cable I'd kept coiled under the desk. Is Wi-Fi easy now? Well, kind of, depending on what you want out of your Wi-Fi. If you want reasonably reliable data transmission at a bit-error rate which varies based on all kinds of external forces, and don't necessarily need to stream high bitrate, high quality, delay-sensitive traffic (and if you're a vendor who believes you can successfully perform that streaming, I'm happy to field test your gear in my home, where I can see as many as two dozen Wi-Fi access points at any given time), yeah, Wi-Fi is easy.

Plus, not only is Wi-Fi easy, it's becoming more and more ubiquitous. I wouldn't say we've reached the era of Wi-Fi as utility; however, as consumers purchase more connected devices, Wi-Fi has become the lowest common denominator for high-speed connectivity. 3G radios are still too expensive compared to Wi-Fi, plus 3G requires a service provider. Wi-Fi is cheap and cheerful, providing the best bang for the buck in terms of bitrate, size, power consumption, and cost.

But, just as Rome wasn't built in a day, neither was Wi-Fi. The first 802.11b products hit the market in early 2000, after years of the initial 1 and 2 Mb/s 802.11 technology; in the nearly nine years since, we've seen advances in throughput (11b-->11g-->11n); advances in security (WEP-->WPA-->WPA2); advances in security setup (curse-->pray-->WPS); advances (?) in messaging (pre-11n, draft 11n, 2.0 but not finished 11n, 11n--we mean it this time); and major reductions in power consumption, package size, and cost.

All of which has taken nine years to play out. Wireless high-def might only be a couple years old, but companies have been trying to deliver consumer wireless TV streaming for years, well pre-dating high-def. Magis Networks closed its doors five years ago, so one can comfortably say that the clock has been running since well before that time. The shift from standard definition to high definition has obviously added dramatically to the complexity of the solution, too.

Where am I going with this, you ask?

Where I'm going is that 2009 isn't going to be the year of the dongle, or of the embedded wireless HDTV--not in the general consumer space, at least. Two major stumbling blocks to success currently exist, neither of which will be resolved in time for the 2009 holiday selling season.

First, for wireless high-def, there are lots of answers, but not really a problem. Sure, I didn't know that I needed Wi-Fi back when I was still tethered to an Ethernet cable, but I'd be hard-pressed to give it up now. For most consumer devices, mobility and flexibility go hand-in-hand. Case in point--I don't believe it would've been possible for laptops to outsell desktops last quarter without embedded Wi-Fi in pretty much every system sold. The portability of a laptop without built-in wireless networking provides limited upside in most consumers' homes. With Wi-Fi built in, consumers are freed from the tether a desktop PC requires, providing considerably more flexibility in just about every aspect of computing.

However, this really doesn't translate into the TV space. How often do you move your TV around? Every few years? Never? Even in rooms other than the living room or main television viewing room, it's doubtful that a TV is moved more than once a year, tops. So, nope, not a problem. What about situations where the connection coming into the house (from the cable, satellite, or IPTV provider) is on the other side of the room from where you want to place the TV? Ah-hah!

Not so much. Yes, this is absolutely a great application of wireless high-def technology. But there's one more issue, which takes me to huge stumbling block number two.

At CES 2007, vendors were quoting prices on products in the $500-$1000 range for a pair of wireless high-def devices (transmitter and receiver). At the time, that really wasn't a huge delta from a high-quality 5-10 meter HDMI cable, which would've run you $200-$300 or more, if you could even find one. Unfortunately, HDMI cable prices have absolutely gone through the floor over the last two years. Right now, a search for 10 meter HDMI cables delivers a list where the top six results come in at prices ranging from $17 to $58. A search for the Belkin FlyWire (the only consumer wireless high-def product I'm currently aware of that's theoretically available for pre-order in the U.S. market) delivers a single result, at $1499.

See anything wrong with that math?

Two years ago, the math wasn't quite as disturbing. All the hot new TVs introduced at CES 2007 had price tags that made a $500-$1000 wireless adapter look pricey, but not ridiculous. Then the bottom dropped out of the television market. In a world where I can buy a bundle of two, count 'em two 42" Sharp 1080p LCD panels for $1499, I'm pretty unlikely to spend that same amount of money on a wireless high-def adapter.

To be sure, vendors are still making announcements. In November, Sony announced a 40" TV with embedded wireless technology, retailing for a cool $5,000. At less than 1 cm thick, yes, I want one.

But not for five grand.

Don't get me wrong. I absolutely believe that at some point, we're going to see TVs with embedded wireless high-def at a price consumers are willing to swallow. A 100% increase in the price of the TV to add wireless? Deal-breaker. A 10% increase in the price of the TV to add wireless? Deal-maker, maybe. The challenge is, if we look at the math of a $1,000 television versus a $1,100 wireless high-def television, that gives us $100 of retail price to play with, or about $30 on the bill of materials (BOM) itself. I find it highly unlikely that any of the wireless high-def vendors will be able to deliver in time for the 2009 holiday selling season a fully functional module at $30 per side. I would love to be wrong about this, but I don't think I will be.
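The retail-to-BOM math above can be sketched in a few lines. Note that the ~3.3x retail-to-BOM multiplier is my rule of thumb from the post's own numbers ($100 of retail price ≈ $30 of BOM), not an industry-standard figure:

```python
# Sketch of the retail-price-to-BOM math described above.
# Assumption: retail price is roughly 3.3x the bill-of-materials cost
# ($100 retail ~= $30 BOM), per the rule of thumb in the post.

def max_bom_for_feature(tv_price, acceptable_premium_pct, retail_to_bom=100 / 30):
    """Max bill-of-materials cost for an added feature, given the TV's
    retail price and the price premium consumers will swallow."""
    retail_headroom = tv_price * acceptable_premium_pct
    return retail_headroom / retail_to_bom

# The post's example: a $1,000 TV with a 10% wireless premium leaves
# $100 of retail headroom, or about $30 on the BOM.
print(round(max_bom_for_feature(1000, 0.10)))  # ~30
```

Run the same math on a $500 netbook-class price point and the BOM budget shrinks to roughly $15 per side, which is why "cheap enough to embed" is such a high bar.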

I'll be meeting with most of the players in wireless high-def next week at CES. I look forward to seeing and hearing what's new in terms of why each vendor's solution is better than everyone else's--compressed versus uncompressed, spectrum position, bandwidth used, yadda, yadda, yadda. But, at the end of the day, I'm pretty certain I know most of the technical arguments; in my mind, the only thing that's going to make wireless high-def as easy as Wi-Fi is price. When Joe Six-Pack can walk into his retailer and purchase a wireless high-def TV without having to worry about gigahertz and compression arguments and 48-bit color support, that's when somebody will be able to say they've won the wireless high-def war. With leading analysts like DisplaySearch predicting that year-over-year LCD revenue will drop for the first time ever next year, the need is greater than ever for vendors to differentiate. A high-quality, well-priced wireless high-def option would make a heck of a differentiator. Plus, with the extra-long TV-selling season next year (extending a month beyond the Super Bowl, thanks to the Vancouver Olympics, which run from February 12-28, 2010), vendors should be even further motivated.

Let's face it--no one has yet made a business out of wireless high-def. Lots of money has been invested in some extremely innovative companies, all of whom have shown awesome demos over the past couple of years. I'm not ready to write off any of the companies or their technology approaches, nor am I prepared to say that the winner will be someone just starting out in a garage who hasn't taken her first penny in funding yet. But, what I will say is that without a high-quality, well-priced wireless high-def option that just works, retailers are going to keep selling a lot of HDMI cables.

I'll be back on January 2nd with my 2009 Tech Prediction #2--the Atom Avalanche. Until then, safe travels, and Happy New Year!

Sunday, December 21, 2008

How to Piss Off Your Customers

Delete their profiles and make them re-register "to better serve" them.

My ass.

24 Hour Fitness, you have GOT to be kidding me. I know people that have been tracking their weight and blood pressure for years on your site. And this, THIS is what you deliver to your paying customers in the name of progress?

(As an aside, Gentle Reader, I apologize if you're shocked that I'm writing about a gym. I may not look like it, but I actually do go to the gym. Seriously. Suspend disbelief.)

You appreciate my patience? You give me too much credit. 24 Hour Fitness, c'mon. I understand that you're trying to roll out a new set of capabilities on the website, but I gotta tell ya, at this point I have no belief whatsoever that you can roll out anything that I'll be able to use with confidence--or whose data won't be orphaned again at some point in the future.

This should be Enterprise Architecture 101 meets Customer Relationship Management 101. Migrate the back-end stuff to the new architecture, making customers happier with new functionality integrated with existing data. You already have a data store of people's information mapped to their login ID. Carry it forward to the new system. Maybe the new system is based on a different, more scalable architecture. Maybe the new system can't import the (hopefully encrypted/hashed) passwords. But, is it really that tough to move the existing data store into the new database model? Databases rarely get simpler, so I doubt that you've gotten rid of the basics like login name, password, and member number. Well, since you need them in the new system, why not carry 'em over from the old system? I'll need to create a new password, you say? That's fine--let me click a button to generate a one-time password reset link, sent to my e-mail address on file. Or, let me punch in the membership number on my card to validate my identity, allowing me to create a new password in the new database. But, to just totally whack everybody's account and user data is inexcusable.
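The migration described above is straightforward enough to sketch. This is a hypothetical illustration, not 24 Hour Fitness's actual schema--table names, columns, and the in-memory databases are all invented--but it shows the core idea: carry the old user store forward, drop only the un-importable password hashes, and issue one-time reset tokens instead of orphaning accounts:

```python
# Hypothetical sketch of the migration argued for above. Schema and
# data are invented for illustration; the point is that login, member
# number, and e-mail survive, and only the password must be reset.
import sqlite3
import secrets

# The legacy system: logins mapped to member data and old password hashes.
old = sqlite3.connect(":memory:")
old.execute("CREATE TABLE users (login TEXT, member_no TEXT, email TEXT, pw_hash TEXT)")
old.execute("INSERT INTO users VALUES ('coop', 'M12345', 'coop@example.com', 'legacyhash')")

# The new system: same basics, plus a one-time password reset token.
new = sqlite3.connect(":memory:")
new.execute("""CREATE TABLE users
               (login TEXT, member_no TEXT, email TEXT,
                pw_hash TEXT, reset_token TEXT)""")

for login, member_no, email, _old_hash in old.execute("SELECT * FROM users"):
    # The old hashes can't be imported into the new scheme, so instead of
    # deleting the account, store a token to e-mail as a reset link.
    token = secrets.token_urlsafe(16)
    new.execute("INSERT INTO users VALUES (?, ?, ?, NULL, ?)",
                (login, member_no, email, token))

row = new.execute("SELECT login, member_no, reset_token FROM users").fetchone()
print(row[0], row[1])  # the account and its history survive the migration
```

Even if the real back end is vastly more complicated, the principle scales: the expensive part of a migration is the data model mapping, not the password reset, and neither requires telling customers to start over.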

Instead of EA 101 meets CRM 101 to deliver a seamlessly elegant solution via an elegantly seamless migration, you've ended up with a train wreck. I guess that's what happens when a company without a CIO tries to roll out a project like this. But, c'mon--one of the CXOs listed on this page had to sign off on this thing. Whether you're the big dog, the marketing guy, or the finance guy, somebody had to know about the fact you were orphaning your user data.

And if you didn't, y'all need to start asking some hard questions of the rest of your team. Pronto.

Saturday, December 20, 2008

My Favorite Device of 2008 is...

...the MSI Wind U100 netbook. I picked up a couple of these on Black Friday to goof around with as sandbox machines. Never in a million years did I expect to buy two, count 'em two laptops in one purchase out of my own pocket. Then again, I never expected to see a well-reviewed laptop for $299.

The Wind isn't perfect. The touchpad is kind of wonky, the WSVGA screen's squat aspect ratio is a little odd, I'm forever fumbling for the period and slash keys, and the included 3-cell battery's life absolutely SUCKS. But, holy cow, this little machine has so much going for it that I can survive the other stuff--after using the Wind for a few weeks, I'm finding that my 13" MacBook (which is by no means ginormous) feels huge by comparison. The 1024x600 screen isn't all that tall, but it looks awesome--and is more than bright enough, courtesy of its LED backlighting technology. The 1.6 GHz Intel Atom CPU has the guts to power the operating system(s) of your choice with cycles to spare.

And, most importantly, it's compact.

Note I'm not saying small. To me, small is the Asus Eee PC in its 7" and 9" versions. The 7" feels like a child's DVD player, while the 9" feels like something one might store recipes on. The difference of an additional inch (actually, 1.1 inches between the 8.9" and 10" screen sizes) of real estate is a big deal. Having synchronized my MacBook with one of my Winds, I now rarely leave the house without throwing the Wind in the car. Sure, Apple fanbois can gush all they want about their iPhones, but when I need a keyboard, a real screen, and a 120 gigabyte hard disk with all my applications and content on it, a phone (be it an iPhone or my BlackBerry) ain't gonna cut it. Maybe in a couple of years, but not today.

The ironic thing about my purchase of the Winds is that I didn't even know they existed less than 72 hours before I bought them. Read my post on the benefits of transparency, and you'll understand what I mean.

I was a pretty big cynic when Intel announced the Atom in March, 2008. The mobile Internet device (MID) class Intel espoused in their initial releases didn't resonate with me; I maintained a healthy dose of skepticism when I attended the Intel Developer Forum in mid-August, and saw a bunch of Atom-based devices that looked like they should be, uh, displaying recipes.

How quickly times change.

The turn of a single season from autumn to winter has delivered devices which are not just functional, but are downright cool. Moreover, they're useful--I've owned some very cool tech over the years that was also pretty worthless, but this new crop of MIDs (in the form of 10" and 12" netbooks) is making a lot of people stand up and take notice, because they're cool, functional, and inexpensive. A lot of my friends and colleagues have talked about how they'd love to get their kids an inexpensive, kid-sized laptop, but could never justify shelling out $500-800 or more to pick up a machine with way too much computing power, way too big a screen, and way too much heft for little Johnny or Janey to carry, drop, and break.

Dads and Moms of the world, you can stop justifying quite so hard. Netbooks are here to stay.

One final note--not everyone loves the netbook class of device. In addition to the potential for customer confusion (e.g., "why is that called a netbook and that's not, even though they're the same size?"), retailers pretty much detest netbooks, at least in this instantiation--not only because they're low-margin devices by themselves, but because they've introduced so much segmentation confusion. Today's netbooks are very self-contained--consumer goes to store (bricks 'n' mortar or online), buys netbook, takes netbook home (or has it delivered), uses netbook, throws netbook in existing bag or briefcase when on the road.

The problem for the retailers is that there's pretty much nothing else in terms of attach rate (or basket sale, if you prefer) going out the door with the netbook. Some vendors are even shipping their devices with a carrying case, further hurting sales on the high-margin accessories retailers love to sell along with the main device. Case in point--you rarely see an HDTV go out the door on its own. And, if you do, the consumer is typically back at the store in short order to buy a longer HDMI cable or an upconverting DVD player. People typically best understand the concept of attach rate by thinking of gaming consoles; the Xbox 360 currently leads the market with an attach rate of 8.1 games per console. That's huge.
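Attach rate is nothing more than accessories (or games, or cables) sold per primary device sold. The 8.1 figure is from the post; the netbook numbers below are made up purely to illustrate the contrast:

```python
# Attach rate: attached items sold per primary device sold.
# The Xbox 360 figure (8.1) is from the post; the netbook unit
# counts are illustrative, not real sales data.

def attach_rate(accessory_units, device_units):
    return accessory_units / device_units

print(attach_rate(81_000_000, 10_000_000))  # Xbox 360-style: 8.1 games per console
print(attach_rate(50_000, 1_000_000))       # netbook-style: 0.05, "well-nigh zero"
```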

But, today, the attach rate on netbooks is well-nigh zero. Users aren't kitting them out with Bluetooth mice, docking stations, or large monitors. They're buying 'em, taking 'em home, and loving 'em. From the retailers I've spoken with, neoprene sleeves seem to be the sole item going in the basket with most netbooks.

Maybe scuba divers should be wary; everyone else, rejoice.

Transparency...What a Great Idea

With employers in such dire straits due to the economy, and with today's lightning-fast pace of news, I'm still amazed by the smoke screen some firms attempt to put forth. In today's world, nothing, nothing is sacred. Heck, if the SIPRNet can suffer a malware invasion, it's unrealistic to expect that company X's layoffs aren't going to show up in the news (or on Valleywag, which passes for news around here).

This week's "announcements" surrounding Apple's involvement in 2009 and future Macworlds are a case in point. Apple marches to the beat of a different drummer on so many different levels, it's almost impossible to compare them to anyone else. Apple's PR department plays by a different rulebook than just about any other public company's PR department. But, this week, they blew it.

Check that. I'm not so sure that Apple PR blew it. In fact, I'd more so say that Apple's executive team blew it. In case you missed it, Apple's ending their Macworld participation after the 2009 show. The bigger news is that Steve Jobs isn't going to be delivering the keynote at the 2009 show.

Big deal? Uh, yeah.

The issue isn't that Apple's pulling out of a big trade show that they don't own. The issue isn't even that Steve Jobs isn't going to be giving his customary keynote, leading to rampant speculation about his health. The issue is that Apple stiffed their partner by being utterly silent until the very last moment about this news, then announcing (or "announcing") the changes only when backed into a corner. Tom Krazit's excellent article on the topic is here.

In this day and age, issues like this won't and can't be swept under the rug. If anyone should know this, it's Apple--maintaining secrecy about a product announcement until a keynote speech is one thing. Not having the traditional keynote speaker show up, particularly when that guy's name is Steve Jobs? What, you thought no one would notice? Apple execs, have you no experience with how not to be seen? Seriously?

So, Apple snuffed it this week on transparency. Not the first time, won't be the last. But, let's have a look at transparency where it worked, and worked well.

I've lived in Silicon Valley for 15 years. And, in those 15 years, just about every piece of technology I've bought at a bricks 'n' mortar store, I've bought at Fry's. If you regularly shop at Fry's (and I'm talking real Fry's like Sunnyvale or Palo Alto, not the ones in Dallas or Fishers where the floor personnel are both helpful and awake), you're used to the concept of courteous and efficient self-service. Meaning, finding it yourself is way faster than asking an employee, who is likely to direct you to a section of the store whose goods bear no resemblance to what you actually were inquiring about. I know Fry's Sunnyvale like the back of my hand, and can get around Palo Alto and Campbell with nary a hiccup, too.

As much as I (and pretty much everyone else I know) like to rag on Fry's, we all shop there for a reason. The reason isn't that I like being sent on a wild goose chase, like I'm competing in some form of silly Olympic games. The reason is, they usually have what I want. And, since odds are pretty good I know exactly where the item is after so many years of shopping there, I can usually go right to the aisle, grab the item, pay, get accosted by the door Nazi, then head home to satiate my geek lust with a new piece of kit.

Now, I have a new favorite place to geek out. Micro Center. I've been driving up and down 101 through Santa Clara for 15 years. At some point, maybe eight or nine years ago, the AMC Mercado shopping plaza was built, including a Micro Center. I barely knew what Micro Center was, although I had a pretty good hunch--a place like CompUSA where I knew more about computers and peripherals than the salespeople. Heck, if I wanted that, I'd just go to Fry's.

But I was wrong. A buddy of mine in D.C. always raves about the Micro Center store in Fairfax, VA. Despite his ravings, I saw no need to go into the Micro Center here. I had Fry's, home of courteous and efficient self-service.

What finally took me into the Santa Clara Micro Center? Transparency. Transparency in the form of a Black Friday ad, available on their website a few days before Thanksgiving. Think back to three or four years ago, when the economy wasn't in the tank, and issues like corporate PR were still more than a little translucent. Opaque, even. The first guys who got their hands on Black Friday ads and started posting them, by my recollection in 2003 or 2004, were absolutely vilified by retailers. I recall at least one retailer threatening to file (or maybe even filing) lawsuits to force the websites to take down the posted ads, lest people learn that the toaster oven from the brand they didn't want was going to be on sale, but only from 5-6 a.m.

Fast forward to 2008. The dismal economic outlook for retailers continues to level the playing field, to the benefit of consumers across the country. For the first time ever, Black Friday sales declined year over year--down 8%, according to NPD. That said, quite a few folks I've talked to were very pleased with their Black Friday purchases, stating that the advance availability of retailers' Black Friday ads enabled them to make smarter and more targeted buying decisions. True, these decisions resulted in less absolute money spent, contributing to some of that 8% decline. This would probably be impossible to measure, but I'd be interested to know whether, despite the decline in spending, consumers were happier with what they did get for their money. And, what percentage of the folks who expressed a better than average level of customer satisfaction had managed to do Black Friday research ahead of time?

As someone who did his research ahead of time, I ended up at Micro Center mid-afternoon on Black Friday. Their online ad had nine different items that grabbed my attention, ranging from flash drives to netbook computers. I didn't know a thing about netbooks when I first saw their ad two days prior to Thanksgiving, but the $299 price pushed me into research mode. What I found was pretty astounding in terms of what this particular netbook could do, so I decided that if Micro Center still had the unit in stock by the time I rolled in there on Black Friday, I'd buy one. After a whole bunch of years ignoring the big Micro Center sign on 101, I made my first trip into the store. What I found wasn't the CompUSA-type store or personnel I expected to find. I also didn't find the vast expanse of everythingness that Fry's offers. Instead, I found salespeople willing to help, merchandise displayed in logical and organized fashion, and a generally all-around better and more manageable shopping experience than I'm used to, all in a store a fraction the size of Fry's.

Am I done with Fry's? Hardly. I've seen too many people swear off Fry's before slinking back in the next week looking for this widget or that doodad. I won't fall into that trap. But, I have found a new venue at which to geek out, one which hopefully will be able to fulfill many of my technical needs. I would've been content to continue driving by Micro Center for the rest of my days, save for the transparency they put forth (and corresponding value I realized) in the days leading up to Black Friday.

For that, you've earned my business. Bravo, Micro Center. Bravo.

Thursday, December 18, 2008

TuneUp for the Mac, + 5 Days...

As you likely noted in my earlier post on TuneUp, I'm excited by the prospect of having a mechanism to clean up my reasonably large and very disorganized iTunes library. After using the program for a day, I purchased the lifetime subscription for $19.95. Is it perfect? No. Is it better than anything else I've found? Yep, absolutely.

Shortcomings include no mechanism to know if you've already cleaned a song. Case in point--I have a ton of compilations (e.g., KFOG's "Live from the Archives" series, lots of soundtracks) which haven't been well-categorized. I started my cleansing by doing a search on the word "various" in iTunes' search box, resulting in more than 3000 hits. Cleaning each of those songs took time and effort, in that sometimes the song would no longer have "various" in its metadata; other times, it'd go to the bottom of the list, resulting in the need to be re-cleaned. Afterwards, starting at the top of the alphabet and working my way down through the As, songs would end up properly classified further down in the alphabet, so that by the time I reached C, I'd find songs I knew had already been cleaned. And, TuneUp's estimated times remaining often bear as much resemblance to reality as my gas mileage does to what was on the sticker.

(What's that? I have to drive 55 mph for that to work? Oh. Never mind.)

Still, all that is mere kvetching. Four letters are justification alone for you to purchase at least a one-year subscription, if not the lifetime version--SXSW. Austin's most awesome South by Southwest (SXSW) music festival publishes each year's music in a torrent format. The range of quality music available annually is nothing short of breathtaking. However, the vast quantity of music can take your breath away, in terms of trying to figure out what's to your liking. Over the years 2004-2008, I had thousands of SXSW songs in my iTunes collection, classified by nothing other than the album name ("SXSW") and the year. If you don't mind sitting down for the next week or two straight, you might have a shot at hearing a bunch of songs you like, assuming you haven't drifted into delirium after your first night of not sleeping. Plus, as much great music as you'll find that you'd likely never otherwise discover, there's quite a bit of music which I'll politely classify as "not to my taste". And figuring out which of those thousands of songs I truly liked has been really, really tough.

Enter TuneUp. I threw about 500 songs from one year at TuneUp, which did an admirable job of matching about 80% of the songs with album cover art and metadata. The process took a few hours, but running it overnight meant I didn't have to babysit the program. The following night, I threw the remainder of my SXSW songs at TuneUp. While the application crashed overnight, it recovered nicely the next morning, having kept a record of what it had seen, as well as the matches it found. And, even more interesting, if you throw a bunch of songs at TuneUp that it knows it's cleaned previously, it'll absolutely rip through the process--it's when a single or a few songs are intertwined with other songs/albums it hasn't seen before that TuneUp slows down.

But, net-net, TuneUp's a winner. I can now listen to all that SXSW music by genre, rather than taking an entirely random approach. Worth the $19.95 lifetime subscription just for that? You betcha.

Standards--Love 'em or loathe 'em, we need 'em

If you follow consumer electronics, home networking, or the carrier market, you're likely aware of last week's ITU consent vote on G.9960, the first step towards a holistic approach to home networking currently referred to as G.hn. Dozens of companies from around the globe have contributed input to the process, in an attempt to deliver a worldwide, best-of-breed, next generation solution for home networking using the existing wires in homes--phone line, power line, and coaxial cable.

Each medium has a unique set of capabilities and characteristics. Power line is ubiquitous. Case in point--I don't know anyone who has an HDTV in their living room who also doesn't have power in that room. Flippant? Maybe, but also true. That ubiquity also contributes to the challenges power line home networks face--"dirty lines" due to old copper or poor quality terminations, spiky current, interference, and lots more.

Physical phone lines are present in just about any home which is a candidate for whole-home networking; despite ongoing consumer defection from land lines to mobiles, the physical copper remains in place. However, in many homes, particularly outside the U.S., the physical phone jack terminates in a single room (or two) in the home, making phone line challenging as a mechanism for providing whole-home connectivity.

Coaxial cable faces challenges similar to phone lines, in that coax isn't typically pulled to every room in the home; also, outside of the U.S., coax is relatively uncommon, so what coax brings to the table in terms of information carrying capacity, it lacks in terms of worldwide ubiquity.

In an attempt to overcome these challenges, the ITU has been working for more than two years on the creation of a unified approach to home networking using existing wires, by utilizing best-of-breed technology from each camp. The ITU effort has a long way (9-12 months, maybe more) to go before finalization, and even longer (an additional 6-9 months) before products show up in consumers' homes. But, make no mistake--a worldwide standard to enable manufacturers to develop a single type of silicon, customized with specific profiles for specific physical media, will drive silicon costs down by promoting a multi-vendor ecosystem which can sell to a worldwide market.

I'll be first in line for those products. Today, our home suffers from horrendous interference in the 2.4 GHz band, making streaming of even standard-definition content over 802.11 a non-starter. Our electrical wiring is more than 35 years old; living in a multi-dwelling unit with Nixon-era wiring delivers a less-than-stellar experience for streaming content over power lines. We have one coax jack, which comes into our home through an external wall. Pulling Ethernet throughout our home is a non-starter financially. But, I'm confident that through the use of a series of devices enabling mix and match of physical media, I'd be able to have a cost-effective home network capable of high-definition streaming, high bitrate data, and VOIP, throughout our home. I'm excited, even if it could be two years till these next generation home network devices show up.

I mention all this for two reasons. First, having been involved in a number of the ITU (and HomeGrid Forum, its complementary marketing, certification, and interoperability partner) meetings over the last year, I have an enormous amount of respect for the members from each participating company, who must weigh their technical contributions against existing company agendas, collaborate to reach common ground on technical approaches, and agree on specifications that are both manufacturable and a genuine technical leap forward.

Second, I began typing this yesterday while sitting in a DLNA face-to-face committee meeting. The amount of minutiae which must be discussed, negotiated, and agreed to so that consumers can have seamless (or relatively so) usage experiences is MIND-BOGGLING, whether we're talking home networking, digital television delivery, whole-home content sharing, or seamless cell phone roaming.

I've heard from quite a few folks that they'd love to see standards and industry alliances go by the wayside in favor of the free and open source community defining standards mechanisms. I gotta tell ya, I don't see that happening in a billion years. If you think certain markets are fragmented now (e.g., GSM vs. CDMA, multiple power line standards), imagine trying to get ANYTHING from multiple vendors to work together in the absence of agreed industry standards. In the last few months, I've spent a lot of time making devices do things they weren't originally designed to do, whether that's loading open source software on routers (DD-WRT on WRT54G variants), gaming consoles (XBMC on the original Xbox), or netbooks (uh, "other" operating systems on MSI Winds). As fun (okay, I'm a geek--"fun") as these projects have been, they reinforce why so many people and companies spend so much time and effort defining guidelines and standards for interoperability. Having struggled mightily to get devices to speak to one another at the discovery and application layers, I will gladly, gladly pay for off-the-shelf products containing UPnP and DLNA stacks to minimize the challenges of getting devices to both talk to one another and share content.
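For the curious, the discovery layer a UPnP stack handles for you is itself a little protocol: control points multicast an SSDP M-SEARCH probe to 239.255.255.250:1900, and devices answer with HTTP-style headers pointing at their description documents. Here's a minimal Python sketch of what that looks like under the hood (my own illustration, assuming nothing about any particular stack's API):

```python
import socket

SSDP_ADDR, SSDP_PORT = "239.255.255.250", 1900

def build_msearch(search_target="ssdp:all", mx=2):
    """Build an SSDP M-SEARCH request, the multicast probe that
    UPnP control points send to discover devices on the LAN."""
    return (
        "M-SEARCH * HTTP/1.1\r\n"
        f"HOST: {SSDP_ADDR}:{SSDP_PORT}\r\n"
        'MAN: "ssdp:discover"\r\n'
        f"MX: {mx}\r\n"
        f"ST: {search_target}\r\n"
        "\r\n"
    )

def parse_ssdp_response(raw):
    """Parse the HTTP-style header block of an SSDP response
    into a dict keyed by upper-cased header name."""
    headers = {}
    for line in raw.split("\r\n")[1:]:
        if ":" in line:
            key, _, value = line.partition(":")
            headers[key.strip().upper()] = value.strip()
    return headers

def discover(timeout=2.0):
    """Send the probe and collect (address, headers) pairs.
    Needs a live LAN with UPnP devices, so it's separate from
    the pure message-building/parsing helpers above."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(build_msearch().encode(), (SSDP_ADDR, SSDP_PORT))
    found = []
    try:
        while True:
            data, addr = sock.recvfrom(65507)
            found.append((addr, parse_ssdp_response(data.decode(errors="replace"))))
    except socket.timeout:
        pass
    return found
```

And that's just discovery--description, eventing, and content directory browsing all pile on top of it, which is exactly why I'd rather buy a certified stack than debug my own.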

So, a shout-out to all of you in the standards and policy communities, whatever you're working on. Thanks, and keep the faith.

Friday, December 12, 2008

Blister Packs and You

Like you, I hate blister packs. HATE 'em. Most consumers believe that blister packs exist solely to annoy you. They don't. Blister packs exist to minimize/mitigate the risk of inventory shrinkage, due to shoplifting or employee theft. And to annoy you.

The University of Florida's excellent annual National Retail Security Survey shows the sobering numbers, so you can understand why product manufacturers are so concerned about creating barriers to shrinkage (and I'm not talking staying out of cold water, Costanza). But, at some point, that barrier leaps from being protective to being overbearing and silly.

Particularly overbearing and silly is receiving an imposingly packaged product from an online retailer--who should have little to worry about in terms of customers pilfering items from a warehouse, although employee theft is obviously still an issue. Every time I receive a product from Amazon, I'm curious whether I'm going to open the box to find a piece of gear logically packaged for storage and shipment from an online retailer, or something that belongs on a retail show floor.

Unfortunately for a young man named Dennis Millarker, he received the latter.

TuneUp for the Mac: First Look

I have a pretty unwieldy music collection--more than 37,000 songs, ripped at random times, on random platforms, using random encoders. I decided last year to wrangle everything into a single iTunes install on a Mac Mini. I chose iTunes because it's easy, and is likely to be supported by Apple for a long time to come. Sure, there are probably better content organizers than iTunes, but I was looking for simplicity in organization; plus, I'm a pretty straightforward user, not an edge case.

That said, I was looking for a reasonable degree of quality; the general consensus is that for VBR MP3 encoding, iTunes pales in comparison to the LAME encoder. I wanted something that would run on a Mac with a good UI, use the LAME encoder, and deposit the newly ripped/encoded content into my iTunes library. I found it in Max, a really neat (and free) program from Stephen Booth (sbooth). I'm not an audiophile, so I wasn't interested in lossless encoding, nor in any of the plethora of formats besides MP3. I wanted to stick with MP3, since it's pretty much the universal music format, takes up a fraction of the disk space of lossless approaches, and plays on just about anything. I chose VBR to get the most bang for my buck, bit for bit.

Max isn't perfect, but for free, I can't argue too much. Max uses an open source music database called MusicBrainz, which is nowhere near as complete as Gracenote's CDDB, which iTunes uses. Thankfully, sbooth also wrote an AppleScript to auto-fill the Max information fields at encoding time by using iTunes as a proxy to pull from Gracenote. Yes, it's manual, but I gotta sit there and shove the CD in for ripping, so a couple more clicks won't kill me.

I re-ripped the ~600 CDs that I own; for the most part, Max + the iTunes script delivered decent data, along with a high-quality audio encode. Sometimes the data would be incomplete or missing altogether. But for the ~3,500 CDs I used to own but have gotten rid of (thanks, Amoeba), I've been stuck with what I have.

And, what I have is a bunch of music with missing/wrong ID tags, missing album art, missing/wrong genres, etc. Since I organized all my music into a single iTunes library last year, I've really wanted to take better advantage of Apple's Front Row, which is a really sweet media center user interface to all the media on a machine. But, with album art missing for thousands of albums, and with inaccurate data for tens of thousands of songs, the experience has been well short of satisfying.

That's why I was so stoked earlier this year to hear about TuneUp, an application designed to clean up all your incomplete/wrongly labeled music in iTunes. However, as punishment to the early adopter crowd, the TuneUp team chose to introduce a Windows version first, making Mac users wait months, MONTHS, for a Mac version. Yeah, I know. Life sucks. Get a helmet.

Today, the helmet came off. TuneUp for the Mac is now available. In somewhat limited use on one of my sandbox machines (not the main iTunes library), I've found TuneUp to be very good at what it claims to do. Sure, there are a few annoyances--the entire clickable area on the 'Save' button doesn't always work, the spinning ball of wait pops up occasionally, the beats per minute field either populates incorrectly or not at all. Despite issues such as these, TuneUp is an extremely useful tool which delivers mostly as promised. Again, I haven't tested this exhaustively. And, judging from comments I saw about problems when Windows users threw huge iTunes libraries at the application, I don't think I'll be asking TuneUp to clean all 37,000 songs at once. But, it's unbelievably cool to command-tab over to iTunes, drag some songs to the TuneUp window, command-tab back over to e-mail, then pop back over a few minutes later to see the results. I highly recommend trying out the free version, which allows cleaning of 500 songs. I'll be spending the $19.95 for the lifetime TuneUp Gold membership soon, probably this weekend. TuneUp is fast, efficient, and unique. Give it a try.