Understanding the outrage at ISP misbehavior by analogy

There have been plenty of articles recently, like this one at Ars Technica, describing the attempts by US Internet service providers to double-dip in their billing (charging both the consumer and the high-traffic websites they frequent). The reactions are usually either outrage at being taken advantage of or a defense of the principle that businesses can charge whomever they want, whatever they want, for their services... And in some way, both are right -- but I think that a simple analogy can illustrate how these ISPs are treading on dangerous ground ethically, if not (yet) legally:

For most people in the US, Internet service is like living on an island with just two bridges to the mainland. Usually, one of them is a whole lot more convenient than the other (it gets you to your usual destinations a lot faster, etc.) -- so, while there is a choice, it is a very limited and unequal choice. Through some quirk of billing, these bridges charge a flat monthly fee for a pre-agreed number of trips over them in a pre-agreed vehicle size. So far, so good.

Where it gets interesting is when the owners of the most convenient, widest, fastest bridge decide that, to supplement their income, they will charge a fee to the stores on the mainland that the bridge's customers visit, and only make it easy for vehicles to reach the stores that pay... Even more interesting, the bridge owners actually own some of these stores. For the most part, nobody notices anything, but some people are starting to question why the bridge ends in a two-lane road unless they are going to, say, barbecue-r-us, the rinkydinkville general store or the godawfulplex movie theater. They can't help but wonder why the small fortune they pay to drive their Suburban across the bridge doesn't allow them to get Chinese for dinner, shop at Target or watch movies at the good theater.

Can you really expect the bridge owners to keep their doorsteps pitchfork-free, even if the town sheriff can't think of a reason why it would be his problem?

 


Something is rotten in commodity-networking-ville

Reading this piece on Anandtech.com today was a pretty grim reminder not so much of what Apple does right (in this case, quite a bit) as of what the race-to-the-bottom commodity networking hardware market has become. We're on the cusp of wide availability of a new networking standard (802.11ac WiFi) that has all sorts of great potential but that is basically a terrible mess, because nobody has any incentive to actually do a good job implementing it. Everybody slaps out some variant of the chipset manufacturer's reference design, does a quick "branding" job on the associated firmware and out it goes! Six months from now, there won't be any bug fixes or updates to the firmware -- it will all have been abandoned for the next model.

This is incredibly frustrating as a consumer and, basically, the choices for anybody who doesn't want this kind of junk are either "Enterprise" vendors (which isn't terribly practical for all but the most technically adept) or Apple. Neither guarantees gear that is bug-free or terribly featureful in the ways we care about, but at least it isn't soon-to-be-forgotten junk. I love it when enthusiast sites declare things like "Apple's New Airport Extreme Offers No Innovation" -- it's so true... Because having useful bug fixes and updates for the next couple of years is not "innovative".

There is no doubt that the purchase price of (real) Cisco or Apple networking hardware seems out of line with the commodity choices, but at some point you realize that the premium reflects something approaching a sane business model.


Futurism myopia

I was listening to this week's "The Talk Show" podcast with Om Malik as the guest (a good episode, BTW) and I was struck by the picks Gruber and Malik made at the end of the show for what they think the (near) future holds for Apple.

Not that their choices were wrong, but they focused on i-devices and iCloud exclusively. I think that technology companies (and especially Apple) tend to follow trajectories with their product lines -- all their product lines. It's too easy to get fixated on the hotness of the moment (mobile devices) and ignore the boring other stuff. In actuality, what seems to be happening is that Apple -- unlike most of the PC market -- has a vision for what computing will be in the not-too-distant future, where mobile devices and traditional computers will come to intersect again.

The huge effort to make Retina-class MacBooks and to redesign what a desktop computer looks like (the razor-thin iMac), combined with a constantly evolving software experience and better ways of connecting the boxes (Thunderbolt and, of course, the upcoming 802.11ac wireless), makes me think that there is a vision for where a "conventional" computer is going. It's not just "the same, but prettier". If anything, I'd say that it is more "it doesn't get in your way" -- that the computer disappears from your desk just like the iPhone and iPad disappear in your hand, leaving the software experience. We're a long way from that, but the trajectory seems pretty clear.

Similarly, the iPad -- while incredibly useful in its original size -- was on an obvious "more sizes" trajectory, too. Smaller was easy. Bigger is going to be hard. Bigger, more useful, while remaining thin and light enough to be handheld is really hard, but the evolution of the software experience to drag iOS into something that makes sense in a bigger form factor will be even harder.

While it's true that we're going to see a constant evolution of mobile devices and that iCloud needs to get a whole lot better on an Internet schedule, the question that comes to mind is how the computers on our desks and in our hands come closer -- not necessarily merging into one thing, but all getting out of our way better and (I suspect) working together better. Taking Windows 8 as the counter-example, we see that the "turducken" of merged mobile and desktop experiences is a train wreck. But we also see that merely merging them doesn't make them work better together. I can't fling my spreadsheet from my desktop to my tablet and back, I can't edit the numbers on one and visualize graphs on the other... There is a long way to go with the software. I think that Microsoft, Google and Apple all intuitively understand that, but only the latter two appear to actually be making progress towards it (in very different ways).

So, what do I expect -- or wish for -- from Apple in the next few months to couple of years? I hope to see how this vision fleshes out. We're not going to see any of the pieces fully formed, but we should expect the trajectories of mobile and conventional computers to become more apparent.


Windows 8 sanity-saving tip: bring back the Start menu

Windows 8 on a desktop PC has been driving me crazy ever since the preview versions. I've been trying really hard to give the "modern UI first" philosophy a chance to grow on me, but it hasn't. If I were using a Windows 8 tablet, like the Surface, I'd be fine (the swipes and hot corners are actually quite tablet-friendly) but, on a conventional PC, it is brutal.

On a conventional PC in a virtual machine -- either with a screen to itself (in a multi-monitor setup) or, worse, windowed -- it is completely awful because the mouse usually has somewhere to go beyond the edge of the screen, making the corners especially tricky to hit. I realize that this isn't how most people experience the operating system, but I do (99% of my Windows usage these days is in a VM)... I don't think that this fundamentally changes the problem, it just makes it more painful.

So, what to do? I've had enough. Clearly others have, too. Thankfully, there seems to be quite the industry around restoring decent desktop behavior to Windows 8. I purchased Stardock's Start8 application, but there are plenty of others, too. All I can say is: if you use Windows 8 on a conventional PC, run -- don't walk -- to get a Start menu-restoring application like Start8! The usability difference is night and day.

It isn't just the Start menu, of course; there are other subtle tweaks done by these applications, but the overall effect is to make the experience far more productive. I don't mean that in the fuddy-duddy sense of "Windows 8 is different, therefore it sucks" -- I really think that it is quite an advance, but Microsoft made a horrible misstep by basically forcing everybody to have a tablet-like experience even when it makes no sense. On a large screen, the full-screen "Windows 8 style" applications are comically oversized. If I were to spend any significant time using them, I think that I'd seriously look at something like ModernMix to fit them in a window, too.

As I mentioned above, Stardock aren't the only game in town. I just happened to have a history with the company (all the way back to using their OS/2 products -- boy, that was a long time ago!), so it was the obvious place for me to shop. Your mileage may vary, as they say.


Now, they have me worried...

For months, journalists and "analysts" have been crowing (yes, crowing: they are positively gleeful about it) that Apple is done. Kaput. The genie is gone. There is no magic left... That's been pretty entertaining on the whole and mostly good insight into the reporters' thought processes: when the one company that truly cares about making its customers' lives better (while making a lot of money at the same time, of course) is the anomaly, they can only conclude that it is not just anomalous, it is wrong. There is nothing worse than something you don't understand.

When Samsung stages tone-deaf and/or sexist skits at its latest product announcement, there is no problem -- because a big company making lots of money while displaying complete disdain for its customers is normal. We can handle that. Whether we like them or not, Apple is a company that has values. I think that this is really what disconcerts most observers about Apple, and what feels reassuringly familiar about Samsung today, as it did about the Bill Gates-era Microsoft in the past: they can't (couldn't) be trusted, and that's what we expect from a hungry company.

Through all of this, I've always had the impression that Apple themselves knew this and weren't too bothered by it. Now, I'm beginning to wonder if they are starting to waver in that belief: Phil Schiller's oddly-timed and sloppy interviews on the eve of the Samsung Galaxy S4 introduction make me think that somebody in Apple's brain trust blinked. Somebody got worried that maybe the press and the analysts are reporting the truth, that the competition is actually winning... Which is the worst thing the company could believe: if there is one aspect where Steve Jobs will be missed at Apple, it is his unwavering belief (and ability to convince others) that they are right. That they are right to do the legwork to figure out what the customer's problem is. That they are right to care about every little detail of the product. That they are right to try things that nobody else has done... Because that is how you win.

If they don't believe that, they lose the one thing that separates them from the rest. It is the difference between doing right by the customer and simply being greedy. Apple has, for the last decade or so, been totally focused on the former while observers have assumed it was the latter. It's fine to worry about what the others are doing, but marketing by looking in the rearview mirror just isn't confidence-inspiring... It's for losers.


Lies, damned lies and... Comcast?

Having dumped Comcast cable TV service at our house a while ago, I had not expected to ever have to deal with their consumer sales organization again (I've had good luck with their "business class" people, though). So, I was somewhat morbidly intrigued when I called them today on behalf of somebody else to navigate through their consumer Internet service offerings. Well, they haven't changed their ways. Bargaining with the Devil is a warm-up for calling them:

 

Me: "What service tiers are available at this location?"

Comcast: "Our most popular service tiers are X and Y"

(Nice non-answer there: don't tell me what is available, only tell me the most profitable ones. Classy as usual, but not unexpected.)

 

Me: "So, what is the long-term monthly cost for X?"

Comcast: "$24.95"

Me: "Is that an introductory price or is this the long-term price after the introductory period has passed?"

Comcast: "That's the introductory price, after six months it goes up a little"

Me: "By how much?"

Comcast: "The monthly price goes up by $20. After the introductory period, it goes up to $44.95"

 

Well, that exceeded my every expectation for borderline dishonest sales pitches. I expected the sales guy to "forget" to tell me that his quote was an introductory price. I expected him to weasel around telling me all the offerings actually available. I am just really impressed that he quantified an 80% price increase as "a little." I don't think I would be able to bend the truth into a pretzel like that.
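For the record, the rep's "a little" is easy to quantify from the two prices he quoted -- here's a quick back-of-the-envelope check (just the arithmetic from the call above, nothing more):

```python
# Prices quoted during the call above
intro_price = 24.95    # introductory monthly price
regular_price = 44.95  # monthly price after the six-month introductory period

# Percentage increase from the introductory to the regular price
increase_pct = (regular_price - intro_price) / intro_price * 100
print(f"{increase_pct:.0f}% increase")  # prints "80% increase"
```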


Did we see the same movie?

Come Oscar night, I was quite interested to see that Zero Dark Thirty had become such a toxic movie -- virtually the movie that shall not be named... While the things depicted in the movie are clearly awful, I thought that it was a really good film. Then I read this piece by Glenn Greenwald and I wonder if we even saw the same movie.

Obviously, he's not alone in his opinion... But I just don't agree with them -- what I saw was the story of the moral journey the nation went on after 9/11 as personified by the character of Maya. The awfulness, the torture, even the willingness to undermine things as vital as polio vaccinations in the third world in the quest to catch one man are presented as events that happened (they did -- maybe not exactly the way depicted, of course). It would be a lie to pretend they didn't and it would be a lie to say that they didn't in some way yield results... That doesn't make them any less wrong. It doesn't make the underlying question less valid: did the ends justify the means?

Much like the underlying uncomfortable and unanswered question, the rest of the film is unconventional in its reluctance to telegraph how the viewer should feel -- the twists and turns, including the violent setbacks, come as shocking surprises. There are so few movies that do that these days. The willingness to let the viewer experience the events without being told what to feel is the brilliance of the storytelling and why it should be seen. Hiding the movie under a rock because it doesn't have a moral commentary layered over it is myopic: the reaction to the story should be yours.

 


Mr. Market is... a finance guy

Seeing this post by Jesse Felder linked by John Gruber, I think that one thing needs saying: the market is populated by finance people, not product people. A company that relies on superior execution and products to continue succeeding is the worst kind of company for finance guys. You can't measure "being good" with a yardstick. That's why movies are terrible investments, and that's why Apple is only ever going to be tolerated as an investment because it makes so damn much money.

The best kind of company (for finance guys, at least) is one that doesn't have to be good, that extracts rent from all consumers whether they like it or not. You know, like Microsoft 15 years ago... Good times!

Update: To clarify, Microsoft about 15 years ago -- because virtually every PC sold in the Western world came with Windows (and most business PCs also came with Office) -- was basically an index fund on a then-booming market. The difference between then and now is that the market Microsoft still utterly dominates (PCs) is receding and the growing markets (e.g., mobile) generate comparatively little revenue for them (token license fees earned from Android handset makers are a far cry from the licensing fees on PCs).


Losing the "near-letter-quality" blemish

As somebody who lived through the progression of personal printing technologies from crappy origins (e.g., dot matrix) to today's by-and-large affordable, wonderfully accurate printers, it's hard to forget that killer description from the days when the technology just wasn't there yet: "near-letter-quality." It was the manufacturer's way of saying, "yeah, it looks like crap... but we're really trying!"

Sadly, I feel that's exactly where we are with computer display devices now. Give or take the odd exception (like the Apple MacBook Pro with Retina Display), whatever display we use to read computer output looks like crap. We know it, but it's the best we can get for now... Yes, just like the printers of twenty years ago. At some point, "good enough" stops being good enough, and I think that point is getting really close: we're getting so used to modern mobile devices with really high-resolution displays that the new "normal" is getting to be better.

I know that there is a strong argument that normal people aren't all that perceptive and that "close enough" reproduction of text and pictures is just hunky-dory for the masses, but I think that change is coming from a somewhat different direction: it isn't that existing user interfaces look so much better on higher-resolution displays but, rather, that the new high-resolution mobile devices are ushering in new user interface aesthetics that just don't work well on the crappy old displays.

Look at Google's and Microsoft's reinventions of their respective mobile user interface look & feel... They embody typography-heavy "clean" design languages that really take a high-resolution display to shine. Trying to design user interfaces in that general style while targeting the kinds of lower-resolution displays on PCs today is an exercise in frustration. It isn't that text isn't readable on these displays; it is simply that it virtually never looks good.

The old-fashioned workaround was to use carefully-sized text in specific typefaces that had hand-optimized renderings, but the choices are incredibly limiting and rarely work for these modern UI aesthetics -- they are a kind of typography time warp where everything parties like it's Y2K... Which would be fine, except that everybody now has a phone or tablet that looks a million times better. People are noticing, but desktop software is going to continue looking like crap until the displays -- the last thing before the user's eyeball -- get out of the "near-letter-quality" ghetto that printers managed to leave over a decade ago.