Why *not* develop for Apple platforms?
With WWDC starting today, I thought I’d write down why, even as a fan of Apple platforms, I still would not want to develop for them.
The old days
I have to admit that I am not a long-time Apple fan in the “bleed six colors” sense. Back in the 1980s, I was more of a hardware hacker with too limited a budget to ever seriously look at Apple products. In the 1990s, Apple had drifted into comparative technical obsolescence, so it was really hard to get excited about the mac. I occasionally used one at work, but the world had more or less caught up with the basics of the user experience, and the classic Mac OS technical underpinnings were stale. There were a multitude of operating systems that were technically more exciting and good enough from a UX perspective, so why pay attention to the also-ran in the market?
Getting on board
Come the 2000s and OS X, things changed. The new Intel macs provided an on-ramp to the platform with a built-in escape hatch (if all else failed, I could always fall back to Windows), so I became one of the early Intel mac adopters (yay for the white polycarbonate MacBook!) and haven’t looked back – as a user – since. Professionally, though, it has never made sense: the kind of software I worked on was always technical, but never important enough to the customer to drive a platform switch. It was always “put your thing on our corporate standard PC”. Later on, web-delivered applications didn’t really care about the client computer – as long as there was a browser, they would run.
During this whole time, the mac remained a great developer machine – whether I was developing for Windows (in a VM) or for Linux servers (the UNIX underpinnings of modern macOS help a lot), it was a great place to work, but never the development target.
Looking at the applications I used on the mac, I could see that there was a really awesome combination at play: a solid operating system, frameworks like AppKit that brought fundamental improvements to the developer experience, and human interface guidelines that meant you could create usable, productive software without a massive investment in programming or design effort. I still didn’t have a work reason to make anything for the platform, but the productivity advantage for technical applications seemed real. The barrier to entry for mac application development was pretty low – tempered by the then-small market.
Enter the iPhone
The iPhone changed things fundamentally. Yes, in all the ways we know, but it also changed the expectation of what an application should look like. Over the next few years, the expectation of highly-designed, highly-polished (as opposed to merely usable) applications negated the productivity advantage. This permeated other software platforms too, of course, and it shifted the baseline of what people expect from software (even line-of-business enterprise software) in an expensive way. Thankfully, iOS 7 and some of the more recent UI paradigms (although we may see something different at this year’s WWDC) put a damper on the design arms race around some of the more expensive trends, but I get the feeling that the baseline application for an Apple platform has become more expensive to develop over the years.
Simultaneously, the iPhone – by virtue of its scale – devalued third-party software. With millions of installations, even complicated software can be sold pretty cheap. However, the expectation that all software is cheap has made low-volume applications extremely hard to sell for what they cost to make, even if they are worth it (i.e., bring commensurate user benefit).
The iPhone also introduced a lot of walled-garden practices. People like to complain about the revenue split requirements, problematic as they are, but those aren’t really the fundamental problem. I think the real problem is that Apple inserted itself as a gatekeeper between developers and their customers. It gave itself the right to deem an application unfit for sale, based on ever-changing interpretations of the App Store rules, which means that anybody thinking of entering the Apple ecosystem as a developer needs to factor in that their application could be rejected for… well, no good reason.
In plain terms, this has dramatically increased the risk of developing for Apple platforms: the platform owner can unilaterally block an application from sale, with little or no recourse, for nebulous reasons that can’t be determined in advance. I specifically do not mean gross violations of the basic App Store rules – I mean reviewers becoming arbiters of what functionality is “good”, sufficient and allowed.
Under these circumstances, who wants to spend a non-trivial (and increasing) amount of development effort making something that the reviewer of the day may decide has no place in the walled garden? Next to this acceptance roulette (repeated with every new submission), the rent-seeking demand for a revenue percentage pales in comparison. To an outsider looking at the Apple platform market, rent seeking is a fact of life one can plan for (although it does make the market less attractive on the whole). What is not manageable is the uncertainty. Not knowing whether an application will be allowed until you have spent significant resources and opportunity cost building it is going to remain the biggest barrier to entry for developers who aren’t already committed to the platform.
I’m sure that Apple feels that the intrinsic value of access to that walled-in market is motivation enough, but the truth is that small businesses growing organically have only so much tolerance for risk. Add to that the expectation that all software is now “cheap”… Why would anybody bother?
This is something that Apple has the ability to address. Whether they want to or not is another matter.