Wednesday, December 31, 2008

Coop's 3 Tech Predictions for 2009
#3--Still No Wireless High Definition TV

Roughly 103 weeks ago, I was quoted in the Chicago Tribune saying that 2007 would be the year of the wireless HDMI dongle, an external adapter allowing consumers to connect their video sources to their displays sans wires. I further predicted that 2008 would see such technology embedded inside televisions.

Uh, whoops.

I took a page out of Henny Youngman's book ("When I read about the evils of drinking, I gave up reading") by eschewing predictions during CES 2008. But, with apologies to Keith Olbermann, you can't stop me, you can only hope to contain me. Thus, I'm back with a few predictions for the year in tech, 2009. The good news is, I can't possibly do any worse than these folks did in 2008.

Let's start with a topic near and dear to my heart, the wireless transmission of high definition television. For the purposes of this post, we'll refer to the concept as "wireless high-def" so as not to infringe upon or provide undue props to any particular company, organization, or alliance. To further clarify, we're talking about delivering high-def consumer content from a source to a sink, such as a digital set-top box to a flat panel TV; we're not talking about shipping video around a newsroom, delivering video over the Internet, or doing live remotes. Think consumer homes, living rooms, and apple pie.

Five score and three weeks ago, we thought we had it nailed at my former company, Tzero Technologies. We believed that we'd created the first viable solution for cutting the cord on the set-top box, enabling the creation of an entire class of devices and accessories freed from the tether of the audio/video cable. Two years on, I still believe that what we'd created then was absolutely revolutionary.

Unfortunately, revolutionary doesn't always translate into sales success. I'm not picking on Tzero, nor am I picking on Pulse~LINK, Amimon, Radiospire, SiBEAM (all of whose technologies I touched on during last year's CES) or any of the other players trying to make a go of it in this field; they (and others) have their heads down, trying to figure out how to deliver solutions that the market wants.

If this were easy, my 2007 predictions would've held true.

But it isn't easy. Wi-Fi wasn't easy in the early days, when I spent a couple hundred bucks for an access point and another hundred bucks for an 802.11b card, only to learn that $300+ bought me a solution dramatically less reliable and much slower than the 100-foot Ethernet cable I'd kept coiled under the desk. Is Wi-Fi easy now? Well, kind of, depending on what you want out of your Wi-Fi. If you want reasonably reliable data transmission at a bit-error rate which varies based on all kinds of external forces, and don't necessarily need to stream high bitrate, high quality, delay-sensitive traffic (and if you're a vendor who believes you can successfully perform that streaming, I'm happy to field test your gear in my home, where I can see as many as two dozen Wi-Fi access points at any given time), yeah, Wi-Fi is easy.

Plus, not only is Wi-Fi easy, it's becoming more and more ubiquitous. I wouldn't say we've reached the era of Wi-Fi as utility; however, as consumers purchase more connected devices, Wi-Fi has become the lowest common denominator for high-speed connectivity. 3G radios are still too expensive compared to Wi-Fi, plus 3G requires a service provider. Wi-Fi is cheap and cheerful, providing the best bang for the buck in terms of bitrate, size, power consumption, and cost.

But, just as Rome wasn't built in a day, neither was Wi-Fi. The first 802.11b products hit the market in early 2000, after years of the initial 1 and 2 Mb/s 802.11 technology; in the nearly nine years since, we've seen advances in throughput (11b-->11g-->11n); advances in security (WEP-->WPA-->WPA2); advances in security setup (curse-->pray-->WPS); advances (?) in messaging (pre-11n, draft 11n, 2.0 but not finished 11n, 11n--we mean it this time); and major reductions in power consumption, package size, and cost.

All of which has taken nine years to play out. Wireless high-def might only be a couple years old, but companies have been trying to deliver consumer wireless TV streaming for years, well pre-dating high-def. Magis Networks closed its doors five years ago, so one can comfortably say that the clock has been running since well before that time. The shift from standard definition to high definition has obviously added dramatically to the complexity of the solution, too.

Where am I going with this, you ask?

Where I'm going is that 2009 isn't going to be the year of the dongle, or of the embedded wireless HDTV--not in the general consumer space, at least. Two major stumbling blocks to success currently exist, neither of which will be resolved in time for the 2009 holiday selling season.

First, for wireless high-def, there are lots of answers, but not really a problem. Sure, I didn't know that I needed Wi-Fi back when I was still tethered to an Ethernet cable, but I'd be hard-pressed to give it up now. For most consumer devices, mobility and flexibility go hand-in-hand. Case in point--I don't believe it would've been possible for laptops to outsell desktops last quarter without embedded Wi-Fi in pretty much every system sold. The portability of a laptop without built-in wireless networking provides limited upside in most consumers' homes. With Wi-Fi built in, consumers are freed from the tether a desktop PC requires, providing considerably more flexibility in just about every aspect of computing.

However, this really doesn't translate into the TV space. How often do you move your TV around? Every few years? Never? Even in rooms other than the living room or main television viewing room, it's doubtful that a TV is moved more than once a year, tops. So, nope, not a problem. What about situations where the connection coming into the house (from the cable, satellite, or IPTV provider) is on the other side of the room from where you want to place the TV? Ah-hah!

Not so much. Yes, this is absolutely a great application of wireless high-def technology. But there's one more issue, which takes me to huge stumbling block number two.

Price.

At CES 2007, vendors were quoting prices on products in the $500-$1000 range for a pair of wireless high-def devices (transmitter and receiver). At the time, that really wasn't a huge delta from a high-quality 5-10 meter HDMI cable, which would've run you $200-$300 or more, if you could even find one. Unfortunately, HDMI cable prices have absolutely gone through the floor over the last two years. Right now, a search on 10 meter HDMI cables on Amazon.com delivers a list where the top six results come in at prices ranging from $17 to $58. A search on Amazon.com for the Belkin FlyWire (the only consumer wireless high-def product I'm currently aware of that's theoretically available for pre-order in the U.S. market) delivers a single result, at $1499.

See anything wrong with that math?

Two years ago, the math wasn't quite as disturbing. All the hot new TVs introduced at CES 2007 had price tags that made a $500-$1000 wireless adapter look pricey, but not ridiculous. Then the bottom dropped out of the television market. In a world where I can buy a bundle of two, count 'em, two 42" Sharp 1080p LCD panels from Costco.com for $1499, I'm pretty unlikely to spend that same amount of money on a wireless high-def adapter.

To be sure, vendors are still making announcements. In November, Sony announced a 40" TV with embedded wireless technology, retailing for a cool $5,000. At less than 1 cm thick, yes, I want one.

But not for five grand.

Don't get me wrong. I absolutely believe that at some point, we're going to see TVs with embedded wireless high-def at a price consumers are willing to swallow. A 100% increase in the price of the TV to add wireless? Deal-breaker. A 10% increase in the price of the TV to add wireless? Deal-maker, maybe. The challenge is, if we look at the math of a $1,000 television versus a $1,100 wireless high-def television, that gives us $100 of retail price to play with, or about $30 on the bill of materials (BOM) itself. I find it highly unlikely that any of the wireless high-def vendors will be able to deliver in time for the 2009 holiday selling season a fully functional module at $30 per side. I would love to be wrong about this, but I don't think I will be.
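If you want to play along at home, here's the back-of-the-envelope version of that math (the roughly 3.3x markup from bill of materials to shelf price is my rule-of-thumb assumption, not any vendor's actual margin model):

    # Hypothetical numbers: what does a $100 retail delta leave for the BOM?
    RETAIL_TO_BOM = 3.3          # assumed markup from bill of materials to shelf price

    plain_tv = 1000.0            # a $1,000 television
    wireless_tv = 1100.0         # the same TV with wireless high-def built in

    retail_delta = wireless_tv - plain_tv
    bom_budget = retail_delta / RETAIL_TO_BOM
    print("Retail delta: $%.0f, BOM budget: ~$%.0f" % (retail_delta, bom_budget))
    # Prints a BOM budget of about $30--radio, antenna, connectors, and all.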

I'll be meeting with most of the players in wireless high-def next week at CES. I look forward to seeing and hearing what's new in terms of why each vendor's solution is better than everyone else's--compressed versus uncompressed, spectrum position, bandwidth used, yadda, yadda, yadda. But, at the end of the day, I'm pretty certain I know most of the technical arguments; in my mind, the only thing that's going to make wireless high-def as easy as Wi-Fi is price. When Joe Six-Pack can walk into his retailer and purchase a wireless high-def TV without having to worry about gigahertz and compression arguments and 48-bit color support, that's when somebody will be able to say they've won the wireless high-def war. With leading analysts like DisplaySearch predicting that year-over-year LCD revenue will drop for the first time ever next year, the need is greater than ever for vendors to differentiate. A high-quality, well-priced wireless high-def option would make a heck of a differentiator. Plus, with the extra-long TV-selling season next year (extending a month beyond the Super Bowl, thanks to the Vancouver Olympics, which run from February 12-28, 2010), vendors should be even further motivated.

Let's face it--no one has yet made a business out of wireless high-def. Lots of money has been invested in some extremely innovative companies, all of whom have shown awesome demos over the past couple of years. I'm not ready to write off any of the companies or their technology approaches, nor am I prepared to say that the winner will be someone just starting out in a garage who hasn't taken her first penny in funding yet. But, what I will say is that without a high-quality, well-priced wireless high-def option that just works, Amazon.com is going to keep selling a lot of HDMI cables.

I'll be back on January 2nd with my 2009 Tech Prediction #2--the Atom Avalanche. Until then, safe travels, and Happy New Year!

Sunday, December 21, 2008

How to Piss Off Your Customers

Delete their profiles and make them re-register "to better serve" them.

My ass.

24 Hour Fitness, you have GOT to be kidding me. I know people who have been tracking their weight and blood pressure for years on your site. And this, THIS is what you deliver to your paying customers in the name of progress?


(As an aside, Gentle Reader, I apologize if you're shocked that I'm writing about a gym. I may not look like it, but I actually do go to the gym. Seriously. Suspend disbelief.)

You appreciate my patience? You give me too much credit. 24 Hour Fitness, c'mon. I understand that you're trying to roll out a new set of capabilities on the website, but I gotta tell ya, at this point I have no belief whatsoever that you can roll out anything that I'll be able to use with confidence--or whose data won't be orphaned again at some point in the future.

This should be Enterprise Architecture 101 meets Customer Relationship Management 101. Migrate the back-end stuff to the new architecture, making customers happier with new functionality integrated with existing data. You already have a data store of people's information mapped to their login ID. Carry it forward to the new system. Maybe the new system is based on a different, more scalable architecture. Maybe the new system can't import the (hopefully encrypted/hashed) passwords. But, is it really that tough to move the existing data store into the new database model? Databases rarely get simpler, so I doubt that you've gotten rid of the basics like login name, password, and member number. Well, since you need them in the new system, why not carry 'em over from the old system? I'll need to create a new password, you say? That's fine--let me click a button to generate a one-time password reset link, sent to my e-mail address on file. Or, let me punch in the membership number on my card to validate my identity, allowing me to create a new password in the new database. But, to just totally whack everybody's account and user data is inexcusable.
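For the record, here's a minimal sketch of what "carry it forward" means (every table and column name is hypothetical, and SQLite stands in for whatever 24 Hour Fitness actually runs; the point is that identity moves forward while passwords get re-established via one-time reset tokens):

    import hashlib, os, sqlite3

    old = sqlite3.connect("legacy_site.db")   # hypothetical old member database
    new = sqlite3.connect("new_site.db")      # hypothetical new platform database

    new.execute("""CREATE TABLE IF NOT EXISTS members (
                       login TEXT PRIMARY KEY,
                       member_number TEXT,
                       email TEXT,
                       password_hash TEXT,    -- stays NULL until the member resets
                       reset_token TEXT)""")

    # Carry logins, member numbers, and e-mail addresses forward. The old
    # (hopefully hashed) passwords may not import, so mint a one-time reset
    # token per member instead of orphaning their accounts.
    for login, member_number, email in old.execute(
            "SELECT login, member_number, email FROM users"):
        token = hashlib.sha256(os.urandom(32)).hexdigest()
        new.execute("INSERT INTO members VALUES (?, ?, ?, NULL, ?)",
                    (login, member_number, email, token))
        # send_reset_link(email, token)  # hypothetical mailer call

    new.commit()

Simplified? Sure. But the shape of the problem is about that complicated.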

Instead of EA 101 meets CRM 101 to deliver a seamlessly elegant solution via an elegantly seamless migration, you've ended up with a train wreck. I guess that's what happens when a company without a CIO tries to roll out a project like this. But, c'mon--one of the CXOs listed on this page had to sign off on this thing. Whether you're the big dog, the marketing guy, or the finance guy, somebody had to know you were orphaning your user data.

And if you didn't, y'all need to start asking some hard questions of the rest of your team. Pronto.

Saturday, December 20, 2008

My Favorite Device of 2008 is...

...the MSI Wind U100 netbook. I picked up a couple of these on Black Friday to goof around with as sandbox machines. Never in a million years did I expect to buy two, count 'em, two laptops in one purchase out of my own pocket. Then again, I never expected to see a well-reviewed laptop for $299.

The Wind isn't perfect. The touchpad is kind of wonky, the WSVGA screen's aspect ratio is a little odd, I'm forever fumbling for the period and slash keys, and the included 3-cell battery's life absolutely SUCKS. But, holy cow, this little machine has so much going for it that I can survive the other stuff--after using the Wind for a few weeks, I'm finding that my 13" MacBook (which is by no means ginormous) feels huge by comparison. The 1024x600 screen isn't all that tall, but it looks awesome--and is more than bright enough, courtesy of its LED backlighting technology. The 1.6 GHz Intel Atom CPU has the guts to power the operating system(s) of your choice with cycles to spare.

And, most importantly, it's compact.

Note I'm not saying small. To me, small is the Asus Eee PC in its 7" and 9" versions. The 7" feels like a child's DVD player, while the 9" feels like something one might store recipes on. The difference of an additional inch (actually, 1.1 inches between the 8.9" and 10" screen sizes) of real estate is a big deal. Having synchronized my MacBook with one of my Winds, I now rarely leave the house without throwing the Wind in the car. Sure, Apple fanbois can gush all they want about their iPhones, but when I need a keyboard, a real screen, and a 120 gigabyte hard disk with all my applications and content on it, a phone (be it an iPhone or my BlackBerry) ain't gonna cut it. Maybe in a couple of years, but not today.

The ironic thing about my purchase of the Winds is that I didn't even know they existed less than 72 hours before I bought them. Read my post on the benefits of transparency, and you'll understand what I mean.

I was a pretty big cynic when Intel announced the Atom in March, 2008. The mobile Internet device (MID) class Intel espoused in their initial releases didn't resonate with me; I maintained a healthy dose of skepticism when I attended the Intel Developer Forum in mid-August, and saw a bunch of Atom-based devices that looked like they should be, uh, displaying recipes.

How quickly times change.

The turn of a single season from autumn to winter has delivered devices which are not just functional, but are downright cool. Moreover, they're useful--I've owned some very cool tech over the years that was also pretty worthless, but this new crop of MIDs (in the form of 10" and 12" netbooks) is making a lot of people stand up and take notice, because they're cool, functional, and inexpensive. A lot of my friends and colleagues have talked about how they'd love to get their kids an inexpensive, kid-sized laptop, but could never justify shelling out $500-800 or more to pick up a machine with way too much computing power, way too big a screen, and way too much heft for little Johnny or Janey to carry, drop, and break.

Dads and Moms of the world, you can stop justifying quite so hard. Netbooks are here to stay.

One final note--not everyone loves the netbook class of device. In addition to the potential for customer confusion (e.g., "why is that called a netbook and that's not, even though they're the same size?"), retailers pretty much detest netbooks, at least in this instantiation--and not just because they're low-margin devices that have introduced so much segmentation confusion. Today's netbooks are very self-contained--consumer goes to store (bricks 'n' mortar or online), buys netbook, takes netbook home (or has it delivered), uses netbook, throws netbook in existing bag or briefcase when on the road.

The problem for the retailers is that there's pretty much nothing else in terms of attach rate (or basket sale, if you prefer) going out the door with the netbook. Some vendors are even shipping their devices with a carrying case, further hurting sales on the high-margin accessories retailers love to sell along with the main device. Case in point--you rarely see an HDTV go out the door on its own. And, if you do, the consumer is typically back at the store in short order to buy a longer HDMI cable or an upconverting DVD player. People typically best understand the concept of attach rate by thinking of gaming consoles; the Xbox 360 currently leads the market with an attach rate of 8.1 games per console. That's huge.
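(For the numbers folks: attach rate is simply add-on units sold divided by primary devices sold. A trivial sketch, with every unit count below invented purely for illustration:)

    # Attach rate: add-on units sold per primary device sold.
    # These unit counts are made up; only the 8.1 ratio above is a real figure.
    consoles_sold = 1000000
    games_sold = 8100000
    print("Console attach rate: %.1f" % (games_sold / float(consoles_sold)))

    netbooks_sold = 1000000
    accessories_sold = 50000
    print("Netbook attach rate: %.2f" % (accessories_sold / float(netbooks_sold)))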

But, today, the attach rate on netbooks is well-nigh zero. Users aren't kitting them out with Bluetooth mice, docking stations, large monitors, et al. They're buying 'em, taking 'em home, and loving 'em. From the retailers I've spoken with, neoprene sleeves seem to be the sole item going in the basket with most netbooks.

Maybe scuba divers should be wary; everyone else, rejoice.

Transparency...What a Great Idea

With employers in such dire straits due to the economy, and with today's lightning-fast pace of news, I'm still amazed by the smoke screen some firms attempt to put forth. In today's world, nothing, nothing is sacred. Heck, if the SIPRNet can suffer a malware invasion, it's unrealistic to expect that company X's layoffs aren't going to show up in the news (or on Valleywag, which passes for news around here).

This week's "announcements" surrounding Apple's involvement in 2009 and future Macworlds are a case in point. Apple marches to the beat of a different drummer on so many different levels, it's almost impossible to compare them to anyone else. Apple's PR department plays by a different rulebook than just about any other public company's PR department. But, this week, they blew it.

Check that. I'm not so sure that Apple PR blew it. In fact, I'd more so say that Apple's executive team blew it. In case you missed it, Apple's ending their Macworld participation after the 2009 show. The bigger news is that Steve Jobs isn't going to be delivering the keynote at the 2009 show.

Big deal? Uh, yeah.

The issue isn't that Apple's pulling out of a big trade show that they don't own. The issue isn't even that Steve Jobs isn't going to be giving his customary keynote, leading to rampant speculation about his health. The issue is that Apple stiffed their partner by being utterly silent until the very last moment about this news, then announcing (or "announcing") the changes only when backed into a corner. Tom Krazit's excellent article on the topic is here.

In this day and age, issues like this won't and can't be swept under the rug. If anyone should know this, it's Apple--maintaining secrecy about a product announcement until a keynote speech is one thing. Not having the traditional keynote speaker show up, particularly when that guy's name is Steve Jobs? What, you thought no one would notice? Apple execs, have you no experience with how not to be seen? Seriously?

So, Apple snuffed it this week on transparency. Not the first time, won't be the last. But, let's have a look at transparency where it worked, and worked well.

I've lived in Silicon Valley for 15 years. And, in those 15 years, just about every piece of technology I've bought at a bricks 'n' mortar store, I've bought at Fry's. If you regularly shop at Fry's (and I'm talking real Fry's like Sunnyvale or Palo Alto, not the ones in Dallas or Fishers where the floor personnel are both helpful and awake), you're used to the concept of courteous and efficient self-service. Meaning, finding it yourself is way faster than asking an employee, who is likely to direct you to a section of the store whose goods bear no resemblance to what you actually were inquiring about. I know Fry's Sunnyvale like the back of my hand, and can get around Palo Alto and Campbell with nary a hiccup, too.

As much as I (and pretty much everyone else I know) like to rag on Fry's, we all shop there for a reason. The reason isn't that I like being sent on a wild goose chase, like I'm competing in some form of silly Olympic games. The reason is, they usually have what I want. And, since odds are pretty good I know exactly where the item is after so many years of shopping there, I can usually go right to the aisle, grab the item, pay, get accosted by the door Nazi, then head home to satiate my geek lust with a new piece of kit.

Now, I have a new favorite place to geek out. Micro Center. I've been driving up and down 101 through Santa Clara for 15 years. At some point, maybe eight or nine years ago, the AMC Mercado shopping plaza was built, including a Micro Center. I barely knew what Micro Center was, although I had a pretty good hunch--a place like CompUSA where I knew more about computers and peripherals than the salespeople. Heck, if I wanted that, I'd just go to Fry's.

But I was wrong. A buddy of mine in D.C. always raves about the Micro Center store in Fairfax, VA. Despite his ravings, I saw no need to go into the Micro Center here. I had Fry's, home of courteous and efficient self-service.

What finally took me into the Santa Clara Micro Center? Transparency. Transparency in the form of a Black Friday ad, available on their website a few days before Thanksgiving. Think back to three or four years ago, when the economy wasn't in the tank, and issues like corporate PR were still more than a little translucent. Opaque, even. The first guys who got their hands on Black Friday ads and started posting them--by my recollection, in 2003 or 2004--were absolutely vilified by retailers. I recall at least one retailer threatening to file (or maybe even filing) lawsuits to force the websites to take down the posted ads, lest people learn that the toaster oven from the brand they didn't want was going to be on sale, but only from 5-6 a.m.

Fast forward to 2008. The dismal economic outlook for retailers continues to level the playing field, to the benefit of consumers across the country. For the first time ever, Black Friday sales declined year over year--down about 8%, according to NPD. That said, quite a few folks I've talked to were very pleased with their Black Friday purchases, stating that the advance availability of retailers' Black Friday ads enabled them to make smarter and more targeted buying decisions. True, these decisions resulted in less absolute money spent, contributing to some of that 8% decline. This would probably be impossible to measure, but I'd be interested to know: despite the decline in spending, were consumers happier with what they did get for their money? And, what percentage of the folks who expressed a better than average level of customer satisfaction had managed to do Black Friday research ahead of time?

As someone who falls squarely into the did-the-research-ahead-of-time camp, I ended up at Micro Center mid-afternoon on Black Friday. Their online ad had nine different items that grabbed my attention, ranging from flash drives to netbook computers. I didn't know a thing about netbooks when I first saw their ad two days prior to Thanksgiving, but the $299 price pushed me into research mode. What I found was pretty astounding in terms of what this particular netbook could do, so I decided that if Micro Center still had the unit in stock by the time I rolled in there on Black Friday, I'd buy one. After a whole bunch of years ignoring the big Micro Center sign on 101, I made my first trip into the store. What I found wasn't the CompUSA-type store or personnel I expected to find. I also didn't find the vast expanse of everythingness that Fry's offers. Instead, I found salespeople willing to help, merchandise displayed in logical and organized fashion, and a generally all-around better and more manageable shopping experience than I'm used to, all in a store a fraction the size of Fry's.

Am I done with Fry's? Hardly. I've seen too many people swear off Fry's before slinking back in the next week looking for this widget or that doodad. I won't fall into that trap. But, I have found a new venue at which to geek out, one which hopefully will be able to fulfill many of my technical needs. I would've been content to continue driving by Micro Center for the rest of my days, save for the transparency they put forth (and corresponding value I realized) in the days leading up to Black Friday.

For that, you've earned my business. Bravo, Micro Center. Bravo.

Thursday, December 18, 2008

TuneUp for the Mac, + 5 Days...

As you likely noted in my earlier post on TuneUp, I'm excited by the prospect of having a mechanism to clean up my reasonably large and very disorganized iTunes library. After using the program for a day, I purchased the lifetime subscription for $19.95. Is it perfect? No. Is it better than anything else I've found? Yep, absolutely.

Shortcomings include no mechanism to know if you've already cleaned a song. Case in point--I have a ton of compilations (e.g., KFOG's "Live from the Archives" series, lots of soundtracks) which haven't been well-categorized. I started my cleansing by doing a search on the word "various" in iTunes' search box, resulting in more than 3000 hits. Cleaning each of those songs took time and effort: sometimes a song would no longer have "various" in its metadata once cleaned; other times, it'd drop to the bottom of the list and need to be re-cleaned. Later, starting at the top of the alphabet and working my way down through the As, cleaned songs would get re-filed further down the alphabet, so that by the time I reached C, I was finding songs I knew had already been cleaned--with no flag to tell me so. And, TuneUp's estimated times remaining often bear as much resemblance to reality as my gas mileage does to what was on the sticker.

(What's that? I have to drive 55 mph for that to work? Oh. Never mind.)

Still, all that is mere kvetching. Four letters are justification alone for you to purchase at least a one-year subscription, if not the lifetime version--SXSW. Austin's most awesome South by Southwest (SXSW) music festival publishes each year's music in a torrent format. The range of quality music available annually is nothing short of breathtaking. However, the vast quantity of music can take your breath away, too, in terms of trying to figure out what's to your liking. Over the years 2004-2008, I accumulated thousands of SXSW songs in my iTunes collection, classified by nothing other than the album name ("SXSW") and the year. If you don't mind sitting down for the next week or two straight, you might have a shot at hearing a bunch of songs you like, assuming you haven't drifted into delirium after your first night of not sleeping. Plus, for all the great music you'll find that you'd likely never otherwise discover, there's quite a bit of music which I'll politely classify as "not to my taste". Figuring out which of those thousands of songs I truly liked has been really, really tough.

Enter TuneUp. I threw about 500 songs from one year at TuneUp, which did an admirable job of matching about 80% of the songs with album cover art and metadata. The process took a few hours, but running it overnight meant I didn't have to babysit the program. The following night, I threw the remainder of my SXSW songs at TuneUp. While the application crashed overnight, it recovered nicely the next morning, having kept a record of what it had seen, as well as the matches it found. And, even more interesting, if you throw a bunch of songs at TuneUp that it knows it's cleaned previously, it'll absolutely rip through the process--it's when a single or a few songs are intertwined with other songs/albums it hasn't seen before that TuneUp slows down.

But, net-net, TuneUp's a winner. I can now listen to all that SXSW music by genre, rather than taking an entirely random approach. Worth the $19.95 lifetime subscription just for that? You betcha.

Standards--Love 'em or loathe 'em, we need 'em

If you follow consumer electronics, home networking, or the carrier market, you're likely aware of last week's ITU consent vote on G.9960, the first step towards a holistic approach to home networking currently referred to as G.hn. Dozens of companies from around the globe have contributed input to the G.hn process, in an attempt to deliver a worldwide, best-of-breed, next generation solution for home networking using the existing wires in homes--phone line, power line, and coaxial cable.

Each medium has a unique set of capabilities and characteristics. Power line is ubiquitous. Case in point--I don't know anyone who has an HDTV in their living room who also doesn't have power in that room. Flippant? Maybe, but also true. That ubiquity also contributes to the challenges power line home networks face--"dirty lines" due to old copper or poor quality terminations, spiky current, interference, and lots more.

Physical phone lines are present in just about any home which is a candidate for whole-home networking; despite ongoing consumer defection from land lines to mobiles, the physical copper remains in place. However, in many homes, particularly outside the U.S., the physical phone jack terminates in a single room (or two) in the home, making phone line challenging as a mechanism for providing whole-home connectivity.

Coaxial cable faces challenges similar to phone lines, in that coax isn't typically pulled to every room in the home; also, outside of the U.S., coax is relatively uncommon, so what coax brings to the table in terms of information carrying capacity, it lacks in terms of worldwide ubiquity.

In an attempt to overcome these challenges, the ITU has been working for more than two years on the creation of a unified approach to home networking using existing wires, by utilizing best-of-breed technology from each camp. The ITU effort has a long way (9-12 months, maybe more) to go before finalization, and even longer (an additional 6-9 months) before products show up in consumers' homes. But, make no mistake--a worldwide standard to enable manufacturers to develop a single type of silicon, customized with specific profiles for specific physical media, will drive silicon costs down by promoting a multi-vendor ecosystem which can sell to a worldwide market.

I'll be first in line for those products. Today, our home suffers from horrendous interference in the 2.4 GHz band, making streaming of even standard-definition content over 802.11 a non-starter. Our electrical wiring is more than 35 years old; living in a multi-dwelling unit with Nixon-era wiring delivers a less-than-stellar experience for streaming content over power lines. We have one coax jack, which comes into our home through an external wall. Pulling Ethernet throughout our home is a non-starter financially. But, I'm confident that through the use of a series of devices enabling mix and match of physical media, I'd be able to have a cost-effective home network capable of high-definition streaming, high bitrate data, and VoIP, throughout our home. I'm excited, even if it could be two years till these next-generation home network devices show up.

I mention all this for two reasons. First, having been involved in a number of the ITU (and HomeGrid Forum, its complementary marketing, certification, and interoperability partner) meetings over the last year, I have an enormous amount of respect for the members from each participating company, who must weigh their technical contributions against existing company agendas, collaborate to reach common ground on technical approaches, and agree on specifications which are both manufacturable and a genuine technical leap forward.

Second, I began typing this yesterday while sitting in a DLNA face-to-face committee meeting. The amount of minutiae which must be discussed, negotiated, and agreed to so that consumers can have seamless (or relatively so) usage experiences is MIND-BOGGLING, whether we're talking home networking, digital television delivery, whole-home content sharing, seamless cell phone roaming, et al.

I've heard from quite a few folks that they'd love to see standards and industry alliances go by the wayside in favor of the free and open source community defining standards mechanisms. I gotta tell ya, I don't see that happening in a billion years. If you think certain markets are fragmented now (e.g., GSM vs. CDMA, multiple power line standards), I'd wish you an enormous amount of luck getting ANYTHING from multiple vendors to work together in the absence of agreed industry standards. In the last few months, I've spent a lot of time making devices do things they weren't originally designed to do, whether that's loading open source software on routers (DD-WRT on WRT-54G variants), gaming consoles (XBMC on the original Xbox), or netbooks (uh, "other" operating systems on MSI Winds). As fun (okay, I'm a geek--"fun") as these projects have been, they reinforce why so many people and companies spend so much time and effort defining guidelines and standards for interoperability. Having struggled mightily to get devices to speak to one another at the discovery and application layers, I will gladly, gladly pay for off-the-shelf products containing UPnP and DLNA stacks to minimize the challenges in getting devices to both talk to one another, as well as to share content.
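To make that concrete, here's roughly what just the first step--discovery--looks like when you do it by hand. This is a minimal sketch of an SSDP search, the discovery mechanism underneath UPnP (the MediaServer search target and the three-second timeout are arbitrary choices of mine); everything above this layer--device descriptions, eventing, content directories--is where the real pain lives:

    import socket

    # SSDP: multicast an M-SEARCH and print whoever answers.
    MSEARCH = ("M-SEARCH * HTTP/1.1\r\n"
               "HOST: 239.255.255.250:1900\r\n"
               'MAN: "ssdp:discover"\r\n'
               "MX: 2\r\n"
               "ST: urn:schemas-upnp-org:device:MediaServer:1\r\n"
               "\r\n")

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(3)
    sock.sendto(MSEARCH.encode("ascii"), ("239.255.255.250", 1900))

    try:
        while True:
            data, addr = sock.recvfrom(2048)
            # Each responder replies HTTP-over-UDP; its LOCATION header
            # points at the device description XML.
            print("%s -> %s" % (addr[0], data.decode("ascii", "replace").splitlines()[0]))
    except socket.timeout:
        pass

And that's just finding the devices; agreeing on what to do with them once found is where DLNA earns its keep.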

So, a shout-out to all of you in the standards and policy communities, whatever you're working on. Thanks, and keep the faith.

Friday, December 12, 2008

Blister Packs and You

Like you, I hate blister packs. HATE 'em. Most consumers believe that blister packs exist solely to annoy you. They don't. Blister packs exist to minimize/mitigate the risk of inventory shrinkage, due to shoplifting or employee theft. And to annoy you.

The University of Florida's excellent annual National Retail Security Survey shows the sobering numbers, so you can understand why product manufacturers are so concerned about creating barriers to shrinkage (and I'm not talking staying out of cold water, Costanza). But, at some point, that barrier leaps from being protective to being overbearing and silly.

Particularly overbearing and silly is receiving an imposingly-packaged product from an online retailer--who should have little to worry about in terms of customers pilfering items from a warehouse, although obviously employee theft is still an issue. Every time I receive a product from Amazon, I'm curious whether I'm going to open the box to find a piece of gear logically packaged for storage and shipment from an online retailer, or whether I'm going to find something that belongs on a retail show floor.

Unfortunately for a young man named Dennis Millarker, he received the latter.

TuneUp for the Mac: First Look

I have a pretty unwieldy music collection--more than 37,000 songs, ripped at random times, on random platforms, using random encoders. I decided last year to wrangle everything into a single iTunes install on a Mac Mini. I chose iTunes because it's easy, and is likely to be supported by Apple for a long time to come. Sure, there are probably better content organizers than iTunes, but I was looking for simplicity in organization; plus, I'm a pretty straightforward user, not an edge case.

That said, I was looking for a reasonable degree of quality; the general consensus is that for VBR MP3 encoding, iTunes pales in comparison to the LAME encoder. I was interested in something that would run on a Mac with a good UI using the LAME encoder, which would also deposit the newly ripped/encoded content into my iTunes library. I found it in Max, a really neat (and free) program from Stephen Booth (sbooth). I'm not an audiophile, so I wasn't interested in lossless encoding, nor in any of the plethora of formats besides MP3. I wanted to stick with MP3, since it's pretty much the universal music format, takes up a fraction of the disk space of lossless approaches, and plays on just about anything. I chose VBR because it delivers the most bang for the buck, bit for bit. Learn more here.
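For the terminally curious, driving the LAME command-line encoder yourself from a script looks roughly like this (the source folder is hypothetical; -V 2 is LAME's high-quality VBR setting, where lower numbers trade file size for quality). Max wraps all of this--and the tagging--in a proper UI:

    import os, subprocess

    SRC = os.path.expanduser("~/rips")    # hypothetical folder of ripped WAV files

    for name in sorted(os.listdir(SRC)):
        if not name.lower().endswith(".wav"):
            continue
        wav = os.path.join(SRC, name)
        mp3 = os.path.splitext(wav)[0] + ".mp3"
        # -V 2 = variable bitrate, quality level 2 (0 is best, 9 is smallest)
        subprocess.call(["lame", "-V", "2", wav, mp3])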

Max isn't perfect, but for free, I can't argue too much. Max uses an open source music database called MusicBrainz, which is nowhere near as complete as Gracenote's CDDB, which iTunes uses. Thankfully, sbooth also wrote an AppleScript to auto-fill the Max information fields at encoding time by using iTunes as a proxy to pull from Gracenote. Yes, it's manual, but I gotta sit there and shove the CD in for ripping, so a couple more clicks won't kill me.

I re-ripped the ~600 CDs that I own; for the most part, Max + the iTunes script delivered decent data, along with a high-quality audio encode. Sometimes, data would be incomplete or missing altogether. However, for the ~3500 CDs I used to own, but have gotten rid of (thanks, Amoeba), I've been stuck with what I have.

And, what I have is a bunch of music with missing/wrong ID tags, missing album art, missing/wrong genres, etc. Since I organized all my music into a single iTunes library last year, I've really wanted to take better advantage of Apple's Front Row, which is a really sweet media center user interface to all the media on a machine. But, with missing album art from thousands of albums, and with inaccurate data for tens of thousands of songs, the experience has been well short of satisfying.

That's why I was so stoked earlier this year to hear about TuneUp, an application designed to clean up all your incomplete/wrongly labeled music in iTunes. However, as punishment to the early adopter crowd, the TuneUp team chose to introduce a Windows version first, making Mac users wait months, MONTHS, for a Mac version. Yeah, I know. Life sucks. Get a helmet.

Today, the helmet came off. TuneUp for the Mac is now available. In somewhat limited use on one of my sandbox machines (not the main iTunes library), I've found TuneUp to be very good at what it claims to do. Sure, there are a few annoyances--the entire clickable area on the 'Save' button doesn't always work, the spinning ball of wait pops up occasionally, the beats per minute field either populates incorrectly or not at all. Despite issues such as these, TuneUp is an extremely useful tool which delivers mostly as promised. Again, I haven't tested this exhaustively. And, judging from comments I saw about problems when Windows users threw huge iTunes libraries at the application, I don't think I'll be asking TuneUp to clean all 37,000 songs at once. But, it's unbelievably cool to command-tab over to iTunes, drag some songs to the TuneUp window, command-tab back over to e-mail, then pop back over a few minutes later to see the results. I highly recommend trying out the free version, which allows cleaning of 500 songs. I'll be spending the $19.95 for the lifetime TuneUp Gold membership soon, probably this weekend. TuneUp is fast, efficient, and unique. Give it a try.

Thursday, November 27, 2008

BlackBerry Storm and its Non-Existent Wi-Fi...

Does the lack of Wi-Fi on the new BlackBerry Storm matter? Check out my answer to a LinkedIn question here.

Sunday, October 19, 2008

Is This How Hester Prynne Felt?

I'm cursed. I readily admit it. I mean, when you're a Cubs fan, you learn to take the bad with the good. Or, as is usually the case on the north side, the bad with the lousy, the inept, and the brutal.

Like most Cubs fans, I've grown accustomed to failure. However, I think all of Cubdom thought this year would be different.

And, as usual, we were wrong.

Thinking that this year was going to be the year, I decided to grow a playoff beard. Sure, the playoff beard is a hockey concept, but heck, the Blackhawks hired John McDonough from the Cubs, and he immediately won over the NHL to turn the Friendly Confines into the Frozen Confines on 1/1/09. Why not make a further connection, however tenuous?

So, I stopped shaving on September 20th, the day that the Cubs clinched the division. That day, I was smiling big, because the Cubs were in the playoffs, and I might have an excuse to not shave for as long as six weeks. As I boarded a plane at SFO bound for Tokyo on October 1st, I was bummed that I'd be missing that day's game. But, as I stroked my still-filling-out beard, I took consolation in the fact that the Cubs had their most competitive team in decades, that they matched up well with the Dodgers, and that they had a solid pitching staff and offense which could hopefully offset a bad performance from one or the other.

(As an aside, I found that stroking one's beard prompts a surprisingly contemplative mood, which I'm guessing contributes to the high percentage of renowned bearded philosophers. That, or sheer laziness.)

Again, I was wrong. Wrong. I'm obviously not The Fonz.

When I woke up a couple of hours before landing at Narita, I asked one of the flight attendants if she could ask the flight deck to find out what the Cubs had done. Needless to say, I wasn't happy with the news, but this was a rock-solid ballclub, one which could (and should) bounce back from this temporary setback. Heck, they'd be sleeping in their own beds, then playing in front of their beloved fans, at the most historic park in the National League. The game two effort would obviously be much better, right?

Wrong. Instead, the Cubs turned into the Keystone Kops. Decent Little League teams don't make 4 errors in one game. Unfortunately, when the clock strikes October and the Cubs are still playing, they turn into something much worse than a decent Little League team.

I spent that Friday (Thursday evening U.S. time) at a trade show outside Tokyo. When I bumped into one of my colleagues, he asked what the beard was about. I explained that it was a playoff beard, that I wouldn't be shaving until the Cubs exited the playoffs, and that the Cubs needed every little bit of help they could get. Shortly thereafter, I used another colleague's computer to hop on the web, where I saw that the unraveling had begun.

By the time the final pitch was thrown at Chavez Ravine on Saturday (a rainy Osaka Sunday), I was ready with shaving cream and a razor. For a few minutes, I actually thought of keeping the beard through the end of the World Series, as punishment for being a Cubs fan. My own personal hairshirt, if you will.

That idea lasted about 30 seconds. However, I did leave the mustache intact for Sunday evening. If you've ever seen me with a mustache (and I can all but guarantee you haven't), you'd know that I look like hell with a 'stache; with a face made for radio, the last thing I need to do is call any more attention to myself. Walking around Kyobashi on Sunday night, I kept touching my upper lip, reminding myself that this was my own little form of penance for being a Cubs fan. As soon as I woke up Monday morning, the mustache came off, too. So much for the penance. I really wanted pennants.

When I jumped on the Internet Monday morning Osaka time, I learned that my mustache wasn't the only thing that'd come off--Cubs fans' gloves had, too. Cubdom is well-accustomed to failure. Unlike Daniel Burnham, who crafted Chicago's master plan and made it the beautiful city it is today, Cubs fans have a well-earned tendency to make very little plans, not the big ones of which Burnham spoke.

But, holy cow. Cubs fans were PISSED. It's one thing for your lover to not show up at the church, as has been the case for the Cubs and their fans for the last century. It's an entirely different thing altogether for the lover to be at the altar, just about to utter the sacred vows and slip on the ring, and have that lover go screaming from the church like a frightened kindergartner with his hair on fire. Notwithstanding the Cubs foldo in 1969, Chicago hasn't witnessed a collapse on this scale since October of 1871. I was surprised to see the vitriol not just in the Chicago papers, but spread out across the Internet--coverage from national media outlets spoke volumes about the mindset of the Cubs fan in October, 2008. Hell hath no fury like a lover scorned, and oh, was Cubdom scorned.

Luckily (?), I was spared the ignominy of actually being in the Windy City when the Cubs collapsed. Regardless of my physical location, having my brain still inside my head was self-flagellation enough; I didn't need to be in the U.S. to be annoyed with the team (for their lackluster play), myself (for being a Cubs fan), or my parents (for not dropping me on my head when I was a child, in hopes that I might've rooted for, uh, the Reds or something). I was despondent enough, even though I was more than 6500 miles from Wrigley Field.

By then, the e-mails had started to pour in, expressing condolence over the Cubs' loss (my loss!), and asking how I was going to cope. Well, being so far away from Chicago helped, just as being in Stockholm (4300 miles away from Comiskey Park) helped me cope with the success of that other team a few years ago. That, and scotch. But, I think the toughest part was when my colleague I'd been speaking with in Tokyo saw me in Osaka on Tuesday morning; his first words were "Huh...the Cubs lost?" He wasn't looking for a fight, and he wasn't trying to be a smart-ass--he had no idea that the Cubs had washed out of the series, but by seeing me clean-shaven, he knew. THAT hurt...having a guy who's not a baseball fan recognize, 5500 miles from where we both live (the Bay Area), that all of Cub fandom was in for another long winter. Ugh.

At the time, I hadn't given too much thought to what things were going to be like when I returned home. Certainly, Hurricane Manny and his brethren had decimated Cubdom, but this wasn't a real hurricane. However one might categorize a broken heart, any pieces we as fans needed to pick up were virtual, not physical.

Returning home on October 10th, the League Championship Series was underway in each league. As a baseball fan, I've certainly been watching over the last week, but with no real sense of attachment. I mean, I'm a baseball fan. This is what I do--watch baseball (although I've found it more enjoyable to use the SAP button during the Fox broadcasts...I prefer a language I don't really understand to Tim McCarver).

Theoretically, my own personal season and involvement in baseball in 2008 ended almost two weeks ago in L.A. But, realistically, I never stop being a baseball fan. By the time the holidays roll around each year, I'm already counting the days till pitchers and catchers report--and by "holidays", I mean Halloween, not Christmas. While I expected that this off-season might be a little tougher than every other since I've been old enough to realize I'm cursed, I really hadn't given it any thought.

Until Friday.

I went to the gym to do my regular Friday morning group exercise class. The mix of the 30-35 attendees is mostly women (with a few guys scattered in for good measure), and is probably half American-born, half overseas-born. Generally, not what I'd consider a room of knowledgeable baseball folks--which is why I felt no compunction about wearing my 2007 Cubs playoffs t-shirt.

Now, I'm going to assume it's merely coincidence, but the word "CUBS" is written in dark red capital letters on the shirt. Not precisely scarlet, but close. In typical Cubs fashion, we get four times as many chances to be recognized as that other gal--Hester Prynne only had to carry a single letter. Our burden, fourfold. I'm going to guess that in Chicago, whether your mood as a Cubs fan over the last two weeks has been misery, angst, or anger, at least Cubdom has had a chance to commiserate with each other. Here in the Bay Area, that's a little tougher to do.

So, back to my point. Before, during, and after the gym class, people kept staring at my chest. (Likely the closest I'll ever come to knowing what it feels like to be a woman.) This wasn't the comedic equivalent of wearing "kick me" on my shirt. This was genuine pity. I might as well have been wearing a flashing neon sign saying "sucker", although even then, I'm not sure I would've drawn the same attention as I had from my fellow gymsters, a mix of folks from early 20s to their 70s; big, small, and everywhere in between; natives, migrants, sports fans, and not.

I don't know if I was more surprised that this motley crew of exercisers knew about the Cubs and their collapse; that they seemed to feel pity for me, which I could see in the eyes of each person who read my shirt; or that I ended up being so self-conscious about all those eyes upon me for wearing my own version of a scarlet letter.

Regardless, until now, I'd never been ashamed of being a Cubs fan, and I saw some brutal baseball in the late 80s and early 90s, when I'd be one of only a few hundred fans in the bleachers during rainy, 45-degree mid-April games. But hey, we were Cubs fans. We were there to watch baseball. We didn't expect to see good baseball (from the home team, at least), nor did we have any kind of expectations for our beloved club. Sure, division championships in '84 and '89 gave folks a little bit of hope, but it was more a case of being happy we'd received an invite to the party at all, rather than grousing that they'd run out of booze after we'd been there awhile.

But man, this sucks. The biggest problem with setting expectations is that you gotta live up to 'em. Tampa went into the season with no expectations, and they're going to the World Series. The Cubs need to start living up to their expectations, and soon.

I don't have another century to spare.

Thursday, October 16, 2008

Hands-on with the Google G1

One of the neat things about living in Silicon Valley is that sometimes mere proximity enables experimentation.

I've been doing a ton of work lately with DD-WRT and Tomato, two versions of open source firmware for consumer networking devices. I'm absolutely geeked about how much cool stuff I can do with these community-developed pieces of software. To think that there's a huge crowd of developers around the world contributing to projects like DD-WRT is pretty awe-inspiring.

Closer to home, equally awe-inspiring is how many innovative tech names, both big and small, are within 15 minutes of me. I'm fortunate that living in Mountain View, I have access to the Google Wi-Fi network which blankets the community. No, it's not my primary connection at home, but it's pretty cool to be able to fire up my MacBook just about anywhere in town and get on the Internet, for free. Just for giggles, I set up a router/AP running DD-WRT to serve as a repeater for the Google Wi-Fi network, to get signal into the house. The standard mechanism to do so has been the use of a Ruckus Wireless access point, but I found that I could do the same thing (and more) with a $30 router instead. Way cool.

So, back to my initial point about living here in Silicon Valley. I wouldn't have been able to do this in a lot of places, but yesterday I managed to get my hands on the Google G1, which ships on the 22nd. Many others have commented on features and limitations of the G1 from a services and connectivity standpoint, so I'll stay away from those topics here. I didn't have nearly enough time to do what I'd consider an exhaustive review, but a few items immediately come to mind, particularly from an ergonomic standpoint.

First, the device itself isn't as big and clunky as I'd expected it to be. Having owned a number of HTC devices (running Windows CE/Mobile), I've found HTC's typical industrial designs to be anything but alluring. And, while I wouldn't call the G1 sexy, I would call it reasonably cool. When the G1 was announced, I read something about this being a 2 1/2 year old HTC hardware design. I'm not sure I'd call the design that dated, but in a world of iPhones and Bolds, I think that the G2 (and other Android-based devices from HTC and other manufacturers) will catch up with the rest of the market soon.

Second, the screen's swing-out-and-up mechanism screams "Andy Rubin" (the man behind Android), in a good way. I was a huge fan of my original Sidekick, despite its shortcomings. Having used a bunch of HTC, Palm, Nokia, and BlackBerry devices, I still believe that the Sidekick (designed by Danger, which was founded by Andy Rubin and Joe Britt) was the best-designed device I've ever used. From the way the G1 screen slides out and up, I'm a little concerned about the long-term durability of the screen-device connector and interface, but only time will tell. After only a few minutes of playing with it, I liked it.

As a number of other reviewers have said, the G1's screen isn't particularly compelling from a visual standpoint. The size is decent, and the touch capabilities are nice, but the resolution is pretty underwhelming--having spent time on a BlackBerry Bold a couple of weeks ago, the G1 pales in comparison. Then again, so does everything else, including the iPhone--the Bold's screen is that good.

Regarding the touch screen, I've found that the trackball on my BlackBerry Curve provides me more than sufficient navigational capability, so much so that I never yearn for a touch screen. When I made the move from a Palm device to a BlackBerry a few years back, it probably took me two weeks to fully break the habit of touching the screen. When I moved from my 8700 to my 8320 a year ago, it only took a couple of days to shed the habit of the scroll wheel in favor of the trackball. Net-net, I'm not really a touch screen guy, and don't find myself wanting one.

But, after about two minutes with the G1, I found myself instinctively using the touch screen for icon-based navigation. The G1's trackball is well-located, and readily enables one to use the trackball, touchscreen, or both. After three years or so of not using a touch screen, the design of the G1 had me thumbing away merrily.

Not so merry for me is the design of the keyboard. I'm a little surprised that with only a couple of exceptions (notably Walt Mossberg, as you might expect), other reviewers haven't mentioned this. Any transition to a new device takes some getting used to; whether it's a new computer, cell phone, or TV remote, anything that necessitates input is going to have a learning curve. However, I'm a little fearful that the design of the G1 is going to result in repetitive stress injuries. Seriously. Typing with the left hand is pretty simple; coming over from a BlackBerry, getting used to key placement would take a little while, but I have no doubt that I'd get used to it. Typing with the right hand is my concern. If you look at the picture, you'll see that the navigation base (my term...I have no idea what it's really called) that's the bottom of the phone when held vertically makes for a big speed bump when held horizontally. Just about every QWERTY mobile device I've ever used has been balanced, with a reasonably even 50/50 weight distribution between hands. Not so with the G1. Because the G1 has the navigation base between your thumb and the keyboard, you end up having to reach waaaaay over to get to the keys, particularly the space bar. I'm a fast, accurate typist on my BlackBerry. And, while I'm certain that I'd get used to the G1's keyboard over time, I fear that I'd end up with a dislocated wrist and a really long thumb--the latter handy for hitchhiking, but not so helpful otherwise.

So, while I'm not going to run out and get a G1 on launch day, I do have a couple of final thoughts.

First, this is really an admirable 1.0 product; congratulations are in order for the Google and HTC teams. Delivering a product this comparatively well-designed at launch is no small effort; hell, look at how long Motorola has been making cell phones, and they still can't come up with a decent user interface.

Second, the fact that this is a six-band phone should be useful for a lot of users. My cell coverage at home on T-Mobile's 1900 MHz band is absolutely abysmal, but my 1700 MHz coverage on their new 3G band (which I've been testing with a Nokia handset) is really solid. Plus, inclusion of the 2100 MHz band makes the phone functional in Japan, which remains an Achilles heel for most U.S. cell phones.

Third, have I mentioned I really don't like the keyboard ergonomics?

Finally, and most importantly, I'm totally stoked about the potential for the worldwide developer community to deliver Android-based applications which pervert (in the best possible sense of the word) the G1 and future Android handsets for the customization and benefit of all. If you'd told me 5 1/2 years ago when I bought my first Linksys WRT54G that in late 2008 I'd be loading the router with newly updated open source firmware, enabling VPNs, VLANs, wireless repeating, Xbox connectivity, and much more, I'd've probably said you were crazy. But, lo and behold, that's what I'm doing. And, with 5 more routers sitting here waiting for experimentation with OpenVPN, wireless DLNA DMS serving, and much more, I'll continue to reap the benefits of the open source movement for years to come.

So too will Android users. A truly open platform enabling innovation from a worldwide community of developers will lead to awesome capabilities, more consumer choice, better carrier competition, and (hopefully) lower monthly phone bills. The G1 is a shot across the bow of the iPhone 3G, the Bold and the Storm, and every other smart (and dumb) phone. No, it's not perfect yet--far from it. But my original WRT54G wasn't perfect either, and it's gotten a lot better with time. You've come a long way, Baby.

Android has a long way to go, but it'll get there. It's off to a great start.

Monday, September 22, 2008

DLNA + Power Line?

I seem to answer one question per calendar quarter on LinkedIn; today appears to be the day for this quarter's answer. Find the question and the response here.

Friday, September 19, 2008

Is the Bullshit of the Month Award Back?

I haven't given out the Bullshit of the Month award since early this year, as organizations seem to have focused on more important things than bogus press releases and disingenuous comments. You know, items like a global financial meltdown and the pending U.S. election--the former caused by bullshit, the latter a huge producer thereof.

But, I woke up this morning to a story in Crain's Chicago Business that set off the BS meter. In a nutshell, the hullabaloo is that public access channels (PEGs, for Public, Educational, and Governmental) on the AT&T U-Verse IPTV system are grouped under a single program guide channel (99). According to Greg Hinz's article, viewers who want to watch a PEG have to navigate first to channel 99, then to a menu of PEGs. This is a big departure from the way things work on traditional cable systems, where the PEGs show up as standard channels in the program guide, leaving some number of viewers up in arms. A quasi-governmental group has claimed that for AT&T to make the PEGs look like regular channels, it'd cost $200k per channel. That's not per metropolitan area or headend. That's per channel. And that's mind-boggling.

Now, if you know me, you know that I'd rather watch ESPN Classic Deportes 6 than watch most PEG content. As Hinz notes, satellite providers don't have to carry PEG channels; as a DirecTV subscriber, I remain blissfully unaware of the activities of both my city council and my local high school sports teams. Even if PEGs were available to me, I'd turn them off on my program guide, just as I do other channels that don't interest me. But, back in Chicagoland, The Folks watch my alma mater's football and baseball games on a (Comcast) PEG channel, even if they've been to the game. They'll even TiVo them occasionally, which certainly gives me pause.

Pun intended.

But, I'm having trouble grokking how putting up a PEG channel can cost $200k. Sure, there's the burden of encoders and ingestion and transport and grooming and lots of other shtuff in the IPTV delivery chain. Plus, AT&T's not exactly a small company, so I wouldn't be surprised if lots of those unlocked iPhones floating around China and Russia are being amortized by the TV guys.

The article quotes the executive director of one of the affected PEG channels as saying that AT&T could rejigger their system to allow PEG channels to appear inline in the program guide, which is the status quo on traditional (QAM-delivered) Chicago cable systems. On the other hand, the Congressional Research Service issued a report yesterday that makes the $200k per channel claim; extrapolated to the entire Chicago area, that's $40 million in one market alone. Frankly, maybe both sides are right. I don't know how many markets U-Verse has launched in, but at tens of millions of dollars per DMA to add PEGs, that's a dealbreaker. I'm not entirely certain I trust the CRS, due to their almost total lack of transparency, despite being a publicly-funded think tank. These aren't black programs, for Pete's sake. AT&T cites yesterday's CRS report, but like so many things CRS, the report isn't publicly available; AT&T could do themselves a huge PR favor by making it available for broad public distribution. Since OpenCRS doesn't have their hands on it yet, I can't actually determine how CRS arrived at this $200k number.
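
For grins, here's the napkin math on the figures as reported--my arithmetic, mind you, not CRS's actual (unpublished) methodology:

    # Napkin math on the reported figures; my arithmetic, not CRS's.
    cost_per_channel = 200_000        # claimed cost to add one PEG channel
    chicago_total = 40_000_000        # the extrapolated Chicago-area figure
    print(chicago_total // cost_per_channel)  # implies 200 PEG channels in the DMA

Two hundred channels at $200k a pop gets you to $40 million, so at least the extrapolation is internally consistent. How anyone gets to $200k for a single channel in the first place is the part I can't check.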

Here's what I can't get my arms around. IPTV is a terribly flexible system...arguably, way more flexible than traditional QAM-delivered cable systems. In theory, an IPTV system can deliver an unlimited number of channels to a user, since the user is simply tuning to a multicast stream, rather than needing access to an entire broadcast program tier over a (let's say) 860 MHz QAM plant, which is restricted by available bandwidth. Cable's migration from analog to digital has allowed a multi-fold increase in the number of channels that can be squashed into that limited supply of 6 MHz slots; of course, the evolution from standard definition to high definition has given back some of that benefit, but the world's still a better place now.
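
If "tuning to a multicast stream" sounds abstract, here's a minimal sketch of the idea in Python--the group address and port are made up, but joining the group and reading packets really is the whole trick:

    # A minimal sketch of an IPTV "channel change": join the multicast
    # group carrying the channel, then read the stream. The address and
    # port below are hypothetical.
    import socket
    import struct

    CHANNEL_GROUP = "239.1.1.99"   # hypothetical multicast group for one channel
    CHANNEL_PORT = 5000            # hypothetical UDP port for the transport stream

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", CHANNEL_PORT))

    # The IGMP join is the "tune" operation: the network starts forwarding
    # the group's packets to this box, and nothing upstream changes.
    mreq = struct.pack("=4sl", socket.inet_aton(CHANNEL_GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

    data, _ = sock.recvfrom(1500)
    print(f"received {len(data)} bytes of channel data")

Contrast that with QAM, where every channel is shipped to every home and the tuner plucks one out of the RF--which is exactly why the plant's bandwidth ceiling matters.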

The question becomes, is AT&T's architecture not flexible enough to allow additional channels into the program guide? Is the program guide real estate too valuable to allow slots to be cluttered up with PEG channels? I'm sure I'm gonna annoy the PEG crowd with this statement, but I don't see PEG viewers driving the type of CPMs AT&T's ad sales team is looking for, so maybe that's a contributing factor to the $200k per channel number.

Maybe it's not AT&T's fault at all. I left Microsoft's TV group a whole bunch of years ago, so I'm not sure what their platform capabilities are these days, but I guess it's possible that the Microsoft TV platform on which U-Verse is based isn't capable of easily doing add/drop of PEG channels. Heck, who knows. I do know that I've seen DLNA program guide demos which seamlessly integrate broadcast television, video-on-demand, IP video (e.g., YouTube) and in-home content (from a PC or network-attached storage device) into a single user interface, so it's at least possible to prototype, if not actually deliver today.

Maybe this is all the FCC's fault, on multiple levels. I'm sure that the cable guys would love to not be carrying PEG channels today, just as the satellite guys aren't; but, the FCC requires cable to carry PEG, while satellite isn't mandated to--and with no mandate, there's no reason to burn the transponder space. You can argue (fruitlessly) all day about the franchise rights of IPTV carriers vs. cable vs. satellite vs. Verizon FiOS, which is a QAM-IPTV hybrid. And, based on where you stand in terms of PEG carriage, NFL Sunday Ticket or MLB Extra Innings carriage, net neutrality, et al., you likely have an opinion on must-carry. You may've even expressed that opinion to the FCC. Maybe you even want a la carte pricing--which to me is a crock, but that's a discussion for another time.

Here's another statement that'll come across as heretical to the PEG crowd, but I'm throwing it out there--maybe the time for local must-carry has simply passed, and the FCC needs to throw out the mandate altogether. Would I advocate that? No way--while I might be apathetic about the content available on PEG, I know how passionate lots of folks are when it comes to being able to view their local content. I'm also not naive enough to believe (as do many friends and colleagues here in Silicon Valley) that this type of content should be delivered solely over the Internet. First, the computer is the wrong venue for public access television; having it available on-demand from the Internet is great, but I really think that PEG content is TV content. Second, the audience for this type of content may not own a computer, have broadband access, or both, so I do think it's a fair trade-off to require service providers to make the content available on TV in exchange for local franchise rights. Finally, public interest content is intended to be available to all. The problem is, the FCC mandate requiring cable, but not satellite, to carry PEGs already provides one massive exception, so maybe AT&T is to be congratulated for providing any type of PEG carriage at all, despite the fact that it's not easy to find and impossible to record. I simply don't know.

Heck, maybe I'm not even calling bullshit on AT&T. Maybe I'm just calling bullshit on the Congressional Research Service for that $200k per channel figure. Am I the only one who finds the alternate meaning of their acronym, CRS, ironic?

Can't remember shit.

Wednesday, September 10, 2008

Sure, Bandwidth Caps Suck. But…

Outside of perhaps Comcast's shareholders and employees, I can't imagine that anyone's happy about the recently instituted 250 gigabyte per month bandwidth cap. But, I gotta tell ya, I understand where they're coming from.

Here’s an analogy for you. Ever been on an airplane? Ever sat next to a “person of size”? Did you wish that person had purchased an extra seat so you weren’t having your space invaded? Or, have you ever had to sit with your legs so tightly jammed against the seat in front of you that you thought you’d die of DVT before landing?

Somewhere along the line, maybe you’ve paid extra to reserve a premium seat, giving you more legroom and/or shoulder room, as well as better service. Last week was a pretty good example. On my United flight from ORD to SFO, I purchased the cheapest possible economy ticket. If I’d been stuck in a standard UA economy seat, I’d’ve been bummin’, as United’s standard seat pitch is fiscally responsible (for them) but physically uncomfortable (for most fliers). Fortunately, over the years, I’ve earned the right to move into a better seat in Economy Plus (E+), providing an extra 4-6 inches of legroom, which makes a massive difference in comfort and productivity. Flying enough BIS (butt-in-seat) miles affords me Economy Plus access whenever I’m on United, but E+ is also a purchase option for those who want a little more legroom but haven’t flown enough to have earned the right to move forward. In this case, paying extra gets you a little more.

The morning of my flight, my upgrade request cleared, enabling me to move up to first class on that particular two-class airplane. If I’d chosen to leave 90 minutes earlier, I’d’ve moved up to business class on a three-class airplane. In either case, some portion of the folks in the forward cabin/s paid more for their tickets to ensure that they’d receive the best seats and service on the airplane. Again, pay more, get more. Other folks paid for upgrades, allowing them to move up for less than the cost of a revenue first/business class ticket. Still others (of whom I was one) burned upgrades earned by flying so many BIS miles. Regardless of how the two dozen of us in that particular first class cabin moved up, we were all up there by choice (no op-ups, with about a dozen opens in the back). And, regardless of what we paid, we all received the benefit of extra leg and shoulder room, as well as a better passenger service ratio.

Frankly, I don’t see the implementation of bandwidth caps being any different.

Airlines have a finite amount of capacity on a given airplane; they need to monetize that capacity in the best way possible. If having two people of size in a three-person row prevents a third person from sitting reasonably in that row’s third seat (leading to the airline’s inability to provide customer #3 the seat and space for which they’ve paid), that’s a problem, which has led most airlines to create policies governing people of size. Anyone, regardless of size, can choose to purchase a ticket providing additional room on most airlines; on airlines without premium cabins (e.g., Southwest), you can purchase two tickets to ensure sufficient space. Because, after all, that’s what airlines are really selling--space to get you from point A to point B. You can pay more, you can pay less, you can redeem miles for different classes of service. Options abound. But, net-net, you’re buying space to take you from one location to another.

Well, guess what? Just like an airplane only has so many seats, service provider pipes only have so much capacity--and spare me any discussion about DWDM and lambdas and such. Service providers are trying to run a business, too. Yes, we're all yearning for more bandwidth to do whatever it is we do on the Internet. And, we've now reached the point where we're gonna have to start paying for that bandwidth, in a fashion little different from classes of service on an airplane.

Am I sticking up for Comcast? Hardly. But, a monthly 250 GB cap is a pretty hefty amount of traffic for today's Internet use. Using up 8 gigabytes a day--which is what it takes to hit 250 GB over a 30-day month--is a whole lot of Internet surfing and e-mail.

What’s that, you say? People are doing more than just web surfing and e-mailing on the Internet these days?

Uh, right.

Here's the challenge--Comcast has stated that today, fewer than 1% of their user base approaches the 250 GB per month number. My parents are unlikely to hit that number anytime soon, maybe ever. Me? Heck, I could hit that number next month if I wanted to, but I don't currently have a compelling application to do so. I mean, I'd have to keep my AT&T ADSL line darn-near saturated; at the average speeds I get (1.2 mb/s down, 300 kb/s up), I might end up melting the copper to get to 250 GB in a month, but I could do it.
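
Don't believe me? Here's the napkin math, using my measured speeds (yours will differ):

    # Could my 1.2 Mb/s ADSL line actually hit a 250 GB monthly cap?
    down_mbps = 1.2                    # my measured average downstream
    seconds_per_month = 30 * 24 * 3600

    gb_if_saturated = down_mbps / 8 / 1000 * seconds_per_month
    print(f"{gb_if_saturated:.0f} GB/month at full saturation")           # ~389 GB
    print(f"{250 / gb_if_saturated:.0%} utilization, 24x7, to hit 250")   # ~64%

Roughly 389 GB a month if I never let the line breathe; about two-thirds utilization, around the clock, to hit the cap. Possible, but you'd have to be trying.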

The question is, how? Well, I certainly know of people who've dedicated one of their home PCs to do nothing but run a torrent client 24 hours a day, 365 days a year. So, yes, they're the people who're in that 1%. Some/many of those 1% are downloading movies encoded in high definition video formats; I recently watched an HD movie being served from a network-attached storage device over a wireless home network at a colleague's house, and I gotta tell ya, it was awesome--serious geek envy, particularly since the double-whammy of living in a Faraday cage in RF-saturated Mountain View makes even basic 802.11 connectivity at our home less than reliable. If my colleague had downloaded that 8 gigabyte file over an Internet connection, he would've wiped out his day's Comcast Internet allotment with a single movie.

Which sucks.

But, again, I understand that Comcast is trying to balance a given amount of resource for a given price. Not to use the old parental adage of one bad apple spoiling the bunch, but the 99% of folks who aren't using 250 GB of traffic aren't the issue here.

Now, lest you think I've lost my mind and am defending the implementation of bandwidth caps, here are a couple of other thoughts.

First, a 250 GB cap is actually a pretty darned good number to start with. Time Warner is testing a 40 GB cap on their high-end ($55/month) offering. If you think you're gonna hate 250 GB per month, you'd really hate 16% of that. You'd also hate the much-lower-than-250 GB limits from Rogers, Bell Canada, Frontier, and lots of other providers who've implemented caps.

So, in my mind, the issue isn't, how do we vilify Comcast (or other providers who've instituted caps)? The issue is, how can we figure out a mechanism to allow those who want a huge (or unlimited) amount of access to actually pay for it, while not unfairly burdening the 90% or more of people who aren't what one might politely call "bandwidth hogs"?

Again, with today’s Internet, my parents aren't going to come near a 250 GB monthly cap, nor will I; those who want to consume that kind of bandwidth simply need to pay accordingly. If you’re in the top 1% of those consuming bandwidth, you’re gonna have to buy a first class ticket for all that leg and shoulder room, Dom Perignon, and caviar.

But, what happens when we start to see more HD video coming in over the Internet? A few hours a day of Internet-delivered HD video will destroy a 40 GB per month cap. A few more hours will wipe out that 250 GB cap.
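
To put rough numbers on that--and note the 8 mb/s stream rate here is purely my assumption, since real-world HD encodes vary all over the map:

    # How fast does Internet-delivered HD video chew through a monthly cap?
    # The 8 Mb/s stream rate is an assumption; actual encodes vary widely.
    hd_mbps = 8
    gb_per_hour = hd_mbps / 8 / 1000 * 3600   # ~3.6 GB per viewing hour

    for cap_gb in (40, 250):
        hours = cap_gb / gb_per_hour
        print(f"{cap_gb} GB cap: ~{hours:.0f} hours of HD, ~{hours / 30:.1f} hours/day")

Call it eleven total hours a month under a 40 GB cap, and a shade over two hours a day under a 250 GB cap. So much for cutting the cord.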

Heck, I can't wait till the day that I can sit in the living room watching my network-attached TV; browse a program guide showing me offerings on my local set-top box, home network-attached storage devices, and broadcast/Internet video; and fire up those video/audio/photo offerings instantaneously, no matter where the content originates. This is the promise of DLNA, the promise of the connected home, the promise of over-the-top HD video delivered via the Internet, you name it. I've seen demos of this type of integrated guide from folks like Macrovision, and I want it. Now.

At that point, hell yes, I'm going to use 8 GB a day of traffic. On any given Sunday, I could envision chewing up 30-50 GB of traffic simultaneously watching multiple channels of baseball or football all day.

If The Wife would let me, of course. Now, where the hell did she hide the remote?

Last night, as I was goofing around with my Verismo PoD, I was pondering how much video I might consume given the opportunity to seamlessly blend a traditional TV viewing experience with the immediacy of watching anything I wanted over the Internet. Sure, video-on-demand has been around for years, but your provider has to have the content you want on their network for you to (reasonably seamlessly) access it--the flip side being that provider VOD comes with the dual benefit of A) not counting against your bandwidth cap, and B) not running afoul of any network neutrality issues.

Looking at a different form of video-on-demand, Apple TV has been out for more than a year; the latest instantiation allows you to not just watch Internet video (and enjoy your own content), but also pay to watch movies over the Internet. The Netflix Roku box takes things one step further, not from a capabilities standpoint per se, but because the box only costs $99, making it an impulse purchase for many households.

The Apple and Netflix boxes are particularly interesting in an age of bandwidth caps. By bringing the promise of premium content into the home at a moment’s notice, they all but guarantee additional consumption of content from, and revenue for, Hollywood studios. I haven’t seen concrete numbers on incremental consumption for those Netflix customers who have a Roku box versus those who don’t, but it’s a safe guess that those who have the box are watching more movies, while generating more money for Netflix and the studios--and eating up more bandwidth per month, leading to an interesting future promotion, "three bucks for a movie and a three gig rebate". Or something like that.

Moving forward, does this presage a world where users are hitting their bandwidth caps every month due to consumption of IP video, either via file download or real-time streaming? Great question, particularly in light of announcements just in the last two weeks--Amazon allowing video downloads with just a web browser, Comcast's addition of downloading to its previously streaming-only Fancast service, Korea moving away from DVDs as a format in favor of downloaded content, and lots more.

Let's face it...we're moving to a world where much of the content we consume is becoming virtualized--either the content itself, the experience, or both. Virtualization is the hot thing in enterprise computing, enabling more efficient use of resources; lots of us also use VM software to run multiple operating systems on a single personal computer. How else would I play Vista Spider Solitaire on my MacBook?

Content consumption is little different. I don't recall hearing anyone describe the Roku box as a "virtual Netflix DVD", but in effect, that's what the experience delivers. In exchange for $99, you get the (admittedly limited selection of) content immediately, without the hassle of little red envelopes and the latency of the postal service--or the need to even own a DVD player. A few years ago, the idea of "virtualized movies" over the Internet was a pipe dream (pun intended). Now, I look at a capability like this and say, sure, this could be in every broadband home in a few years. What's next? Virtualized high definition broadcast TV, where you don't need an HD service provider? Virtualized real-time gaming, where you don't even need to own a console? Heck, if you can dream it, somebody will figure out a way to do it. All you'll need is a TV and a decent-sized low-latency Internet connection.

HOWEVER…

Bandwidth caps are going to severely limit the ability for broad consumption of many of these new applications. Does this “limit innovation”, as a number of folks have said? No, not at all. Innovators innovate. These transient speed bumps are nothing more than that--innovators figure out how to evade or avoid them. Those who are stopped by speed bumps aren’t innovators. Plus, they’ve probably chopped their cars a little too much.

But, is it fair to say that bandwidth caps can limit applicability and adoption? Yeah, definitely.

So, what's the solution here? I believe that we're destined for at least a few years of tiered service offerings that closely resemble your choices in airline travel. Lots of households (e.g., my parents) will choose a basic plan, with a relatively low cap at relatively low prices. That's no-frills economy class, a la Southwest (although that's not intended as a knock on Southwest, which I believe is now the best U.S. carrier, all things considered). Some folks will choose a premium economy class, providing either bigger pipes, a higher bandwidth cap, or both. Service providers who offer triple- or quad-play are in the catbird seat in this category; I could certainly envision a provider offering a higher cap or a bigger pipe to customers who also take its digital phone service, television offering, or mobile offering. The analogy? United's Economy Plus.

For those who want/need the next level up, we move up to the realm of business class service. Think Virgin's Upper Class--a service head and shoulders above economy, and considered to be well above the competition's, too. We're talking quite a bit more money, with quite a bit more benefit. That benefit might come as a result of simply paying more, or you might receive it by buying the full triple-play bundle from your provider--the latter resembling the use of an earned upgrade to get you into business class.

The piece de resistance ends up being first class on a true luxury basis, a la Emirates or Singapore Airlines. For an enormous, massive, ludicrous premium over other classes of service, international first class passengers receive the absolute best possible service an airline can provide. Similarly, the first class offering for service providers would be a big pipe (think 50+ mb/s, via a fiber or DOCSIS 3.0 link) and no bandwidth cap. At that point, the top 1% of traffickers would be paying their own freight, while staying the heck out of the way of everyone else. For the price, first class passengers get a limo ride to the airport, separate check-in and lounge areas, a personal concierge to walk them to the airplane, the utmost level of service on the flight, a shower on the other end, and a whole bunch more--for 20-50x (or more) the price of a coach ticket.

Do I think that a service provider's ultra-premium bandwidth offering will cost 20-50x that of basic service? No, not at all. But, I think it's entirely reasonable for a standard (economy) rate to be ~$25 a month, with tier 2 (premium economy) at ~$50 a month, tier 3 (business) at ~$80 a month, and tier 4 (first) at $200 a month, at least in the U.S. Charging much more guarantees that the U.S. will continue to end up towards the bottom of the top 20 in terms of broadband value. But, not doing anything will continue to have the few screwing up the party for the many. I certainly envy the Japanese, who can get a 100 mb/s link into the home for ~$30 a month, and yearn for the day where we might see a similar offering here.

Two final thoughts to keep in mind.

One, Moore's Law will continue to be a double-edged sword. On the one hand, network gear is going to continue to wring the most possible bandwidth out of a given medium, whether it's fiber or copper. That's great for consumers and providers. The flip side is that all that computing power in the home and accessible via the network is going to continue to put greater and greater demands on the network itself. It's an endless cycle--nature abhors a vacuum.

Two, a simple pendulum will always search for equilibrium. That's in effect what's happening here. Think of bandwidth caps as gravity, an attempt to bring the highest 1% of traffic (ab)users down to earth and into equilibrium. In air, a physical pendulum is affected by atmospheric and mechanical forces--which is precisely what bandwidth caps are.

A drag.


(Full disclosure: I have a consulting relationship with a company in which Comcast has invested; these opinions are solely my own, and in no way necessarily reflect those of my customer or of Comcast)

Thursday, September 4, 2008

The Google Backtrack, and an Interesting Coincidence

Old news by now, but within an hour or so of my post yesterday (and entirely coincidentally, lest I think I have any influence whatsoever anywhere in the world), Google put out a statement that they were going to be pulling back on their Chrome EULA. What I still don't understand is who the megalomaniacal lawyer/s were who thought this was ever a good idea in the first place. This doesn't strike me as a case of accidentally repurposing one product's EULA for another.

I mean, this isn't a rarely used application that prints certificates for Cubs World Series attendees. It's a friggin' browser, for Pete's sake. We've all seen way too many cases where Legal and PR don't get together up front, and end up having to endure a firestorm on the back end. But, this is a rare hiccup for Google Legal, one which the PR team is gonna have a tough time erasing anytime soon. The good news for them is that most of the world A) doesn't read EULAs; B) doesn't care about EULAs; and C) lives outside of the geek realm y'all and I live in and probably hasn't even heard about this, so the impact is relatively limited.

And now, the interesting coincidence.

As you likely know, Google Alerts is an awesome clipping service. Alerts are typically in my inbox within 15-30 minutes of something hitting the Internet--not just the wire, but generally anywhere on the Internet. That's awesome. I have an alert set up for "mike coop", with my name in quotes to ensure that when my name shows up, I get a ping. This obviously wouldn't work if my name was John Smith, but having a fairly unique name makes it easy to A) ensure that my blog entries have properly posted; B) keep an eye on anyone referring to me, more of a precautionary measure than anything else; and C) see what my fellow Mike Coops are up to, whether we're talking the race car driver, the alderman, or others.

When I post a new item on my blog (hosted by Blogger, which is owned by Google), I typically receive a Google Alert within a half-hour. Yesterday's post on Chrome? Nada. No alert whatsoever. Kind of interesting.

Somewhere, Oliver Stone is nodding.

Wednesday, September 3, 2008

Google Chrome Review in One Word: EVIL

I was intrigued by Monday's news that Google would be releasing a browser yesterday. While I used both IE and Firefox regularly in my Windows days, I'm exclusively using Safari now that I'm back on a Mac. Safari isn't perfect, but I really don't need more than one browser. Since Chrome is currently available only on the Windows platform, I couldn't give it a shot on OS X, so I cranked up my Fusion Vista virtual machine and installed Chrome.

That's as far as I got.

I'm the rare person who occasionally reads the End User License Agreement, just to ensure that organizations aren't sneaking in anything nefarious--surrender of first-born, follow-on goat hexes on the Cubs, etc. The Chrome EULA is about as scary as anything I've ever read. Here's the part which should chap your ass:

11. Content license from you

11.1 You retain copyright and any other rights you already hold in Content which you submit, post or display on or through, the Services. By submitting, posting or displaying the content you give Google a perpetual, irrevocable, worldwide, royalty-free, and non-exclusive license to reproduce, adapt, modify, translate, publish, publicly perform, publicly display and distribute any Content which you submit, post or display on or through, the Services. This license is for the sole purpose of enabling Google to display, distribute and promote the Services and may be revoked for certain Services as defined in the Additional Terms of those Services.

11.2 You agree that this license includes a right for Google to make such Content available to other companies, organizations or individuals with whom Google has relationships for the provision of syndicated services, and to use such Content in connection with the provision of those services.

11.3 You understand that Google, in performing the required technical steps to provide the Services to our users, may (a) transmit or distribute your Content over various public networks and in various media; and (b) make such changes to your Content as are necessary to conform and adapt that Content to the technical requirements of connecting networks, devices, services or media. You agree that this license shall permit Google to take these actions.

11.4 You confirm and warrant to Google that you have all the rights, power and authority necessary to grant the above license.

Seriously? I mean, seriously? I'm sure I've violated some kind of law just by copying and pasting this text here onto my (Google-hosted) blog. But, c'mon. I get to retain my copyright to stuff I already own (mighty magnanimous of you), but Google gets to pervert it in any other way possible? This EULA makes Mapplethorpe's stuff look pedestrian.

Let me get this straight. If I have a photo of someone in Picasa--a photo I took--then just by viewing it in Chrome, you, Mr. Google, receive a perpetual, irrevocable, worldwide, royalty-free, and non-exclusive license to reproduce, adapt, modify, translate, publish, publicly perform, publicly display and distribute any Content which you submit, post or display on or through, the Services. (emphasis mine)

Seriously?

Sure, section 9.4 of the EULA claims that Google has no right/title/interest and/or no IPR to your stuff. And, sure, if it's on the Internet, people can get to it. But having content on the Internet is one thing. Google stepping up and giving themselves the right to pervert that content just because you viewed it in their browser is something else altogether.

I printed the EULA to PDF while I was installing Chrome, but didn't actually read it until after I'd installed and launched the application. Luckily, I hadn't done anything but fire up the app before getting into the EULA idiocy. I immediately deleted Chrome, scrubbed my registry, contemplated throwing away the entire virtual machine (but chose to only roll back to a snapshot), and wondered which of the Sony rootkit genius lawyers Google had hired to advise them on this EULA.

Then I took a hot shower, just for good measure.

I really, really like Google, but I like them less than I did yesterday. I host my blog with Google, I use Google Apps for Your Domain, I use Picasa, yadda, yadda, yadda. I've contemplated paying for Google Apps Premier, just for good measure.

But, whatever happened to "Don't Be Evil"?

Seriously.

Friday, August 22, 2008

New Jawbone battery life

Man, Thursdays suck. Don't get me wrong--generally, they're great, because you're past Hump Day, with only one more day till the weekend. What I really mean is, Thursdays for me require somewhere between six and ten hours on the phone for various standards and customer calls. If I'm at my desk, I do some/most of these calls from a landline. However, if I'm on the road, that means I'm on my BlackBerry and my (no longer so) New Jawbone.

Most days, the 4-4.5 hour talk time of the New Jawbone is sufficient, but on Thursdays, it's a non-starter. So, I've charged up the old Jawbone, and intend to use the original on Thursdays from here on out. I used to get 7+ hours out of the original Jawbone when it was new; if I can get 6 these days, it'll be more than worth the extra heft. That said, putting the old Jawbone in my ear after the New one feels like the difference between a regular carhop order and something that Fred Flintstone would order.

One more annoying thing--when the New Jawbone dies, there's no dying-man's-last-gasp--i.e., a tone to let you know that the battery's conked out. Nope, just pure silence; I can never tell if that's attributable to the noise canceling capabilities of the other party's headset, or just me being dead in the water.

Yep, the latter. I'm throwing my BlackBerry wired headset in the car just in case...

New Jawbone Carrying Case

I receive a lot of questions about how I carry my Jawbone. Unfortunately, the answers are A) in my pocket, or B) in an outer pocket on whatever bag I'm carrying at that moment. I yearn for the days of my Plantronics 640 and 655, with their nifty carrying cases which clip right into a pocket or shirt placket.

That said, I can make one recommendation for a way to protect your new Jawbone (or other small devices) when traveling. From the never-send-a-boy-to-do-a-man's-job category, the OtterBox 1000 from the eponymous company offers an unparalleled level of protection. If you're a suspenders-and-belt person, the OtterBox 1000 provides an awesome way to mitigate the risk associated with carrying an expensive headset in your pocket/bag. Plus, the 1000 can make the blind-feel-in-bag-for-headset exercise a thing of the past. All too often, I end up digging around in my bag once the plane has landed, trying to locate my Jawbone. When I travel with my 1000, I never have that issue. The 1000 is also fantastic for carrying compact digital cameras; my Canon SD700 IS fits perfectly inside the OtterBox 1000 case. If I'm on a boat or otherwise near water, my camera's in my OtterBox 1000.

Speaking of OtterBox, long-time readers will recall how much I (still) love my black OtterBox Defender case for my BlackBerry Curve. The Curve itself is a fairly delicate and pretty expensive piece of kit. I drop my Curve about once a month, whether I need to or not. Not to jinx myself here, but the Defender has saved me every time, along with providing sufficient heft that I can type without worrying that the device is going to fly out of my hands.

I doubt that OtterBox would make a case one-third the size of the 1000 (the OtterBox 333.3?), but if they did, it'd be an awesome carrying case for a Bluetooth device. The 1000 retails for $11.49. I'd pay that for a case one-half to one-third the size of the 1000; at less than $10, it'd be a no-brainer.

So, OtterBox, maybe there's an idea for you--the OtterBox 500, at half the width and two-thirds the height of the 1000, priced at $9.95, with the goal of carrying compact Bluetooth headsets securely and safely...

Thursday, August 14, 2008

Not that the world needs more protocols...

But how about something like the Home Network Time Protocol (HNTP)? I just walked through three rooms in our place, and the five digital clocks in them show five different times. All our Macs are perfectly synchronized to the second thanks to NTP. I wish every digital clock had the capability to sync the same way.
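
For the curious, the guts of a network time check are almost embarrassingly small. Here's a bare-bones SNTP query in Python; the server name is just the usual public pool, and any decent NTP server would do:

    # A bare-bones SNTP query: ask an NTP server what time it is.
    import socket
    import struct
    import time

    NTP_SERVER = "pool.ntp.org"    # example public server pool
    NTP_TO_UNIX = 2208988800       # seconds between the 1900 and 1970 epochs

    # 48-byte request; first byte 0x1b = leap 0, version 3, mode 3 (client)
    request = b"\x1b" + 47 * b"\x00"

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(5)
    sock.sendto(request, (NTP_SERVER, 123))
    response, _ = sock.recvfrom(48)

    # The transmit timestamp's whole-seconds field sits at bytes 40-43.
    ntp_seconds = struct.unpack("!I", response[40:44])[0]
    print(time.ctime(ntp_seconds - NTP_TO_UNIX))

Something not much smarter than that, baked into every clock in the house, is all HNTP would really need to be.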

I vaguely recall a technology a while back that offered RF synchronization with some kind of baseline clock (the terms "NIST" and "cesium atomic" come to mind), but I guess that never went anywhere. I'd love to see it come back...