There's been a lot of banter about Apple's refusal to support Adobe's Flash on its iWhatevers. Adobe weighed in unofficially, and now Apple's Steve Jobs has penned an open letter.
Mr. Jobs gave the expected reasons for Apple's refusal to support Flash, and all were well articulated. I personally appreciate Apple's stance because I think Flash has to go, but simply find it ironic that Apple is pushing for an open web in this case. Mr. Jobs, although you skim over apps in your diatribe, we know the full story. Proprietary apps that only work on your devices do their fair share to stifle the open web. But I do understand that going this route allows you to offer a clean experience that's tightly coupled with the device right now, and I'd like to have faith that you will one day evolve this situation along with evolving web standards, and eventually abandon device-centric apps in favor of the open web. How and how quickly you do this will tell me a lot about your true interest and relevance in the open web you claim to espouse.
Many (including Adobe) would argue that Apple is being unnecessarily picky about the way things are built for its platform. Flash, they say, is a framework and language with great tools and a great following, so why not support it?
Well, for one, if you think Apple should simply give you choice with little other consideration, you're asking them not to be Apple. Apple is defined by their walled-garden approach (see my previous post): making the experience better through thoughtful restrictions. Whether it's limiting their OS to their own hardware or deciding which apps you can run on your device, this approach arguably makes their products better and more accessible for many. Jobs articulated several reasons why Flash would impair the experience, both for end-users and, they believe, for their developers. If we don't like it (I happen to agree), we can only hope for or create other choices. (And we have other choices with Android, thanks to Google.)
I think it's mostly a happy byproduct that Apple finds itself appearing to protect and support the open web, but they're right to "believe that all standards pertaining to the web should be open", and that is the most interesting part of this discussion. So what if Flash has great tools and a following? That alone does not determine what should be an accepted web standard. Microsoft also happened to have great tools and a following -- but ActiveX was a nightmare that nobody respectable would argue should be a web standard. Web standards are critical to the evolution of the web.
Open web standards are evolving (with HTML5 and beyond) to include native video, animation, and 3D, and it's imperative that this happens. Apple may have beef with Adobe, and its reasons for not supporting Flash in the meantime may be inane or justified, but either way Apple is in a strong position: the web development community knows that Flash must give way to web standards, so any uproar is muted because Apple slyly appears to be working on the community's behalf by refusing Flash.
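To make the "native video" point concrete, here's a rough sketch of what the HTML5 approach looks like (file names are placeholders): the browser plays the video itself, with no plug-in in sight, and clients that don't understand the element render the nested fallback instead.

```html
<!-- HTML5 native video: no Flash plug-in required. Paths are placeholders. -->
<video src="movie.mp4" width="640" height="360" controls>
  <!-- Fallback content, rendered only by browsers without video support. -->
  <p>Your browser doesn't support HTML5 video.</p>
</video>
```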
Flash, in its current form, will go and must go. I agree with Steve Jobs (whoa) that Adobe should focus on evolving their great tools to support web developers and designers who are creating video, animation, and 3D for a web based on open standards.
The argument continues: but what if someone likes Ruby, Python, or C#? Apple is shooting themselves in the foot by being picky about the way things are built for their platform.
Well, then you'd have to say that the open web is shooting itself in the foot by being picky about the way things are built for its open platform. And that would be ridiculous. If you don't think there need to be web standards, you're on the wrong side.
Do you see what's going on here? Jobs has cleverly shifted the discussion about how open Apple should be to a discussion about the open web and what open really is. In other words, if the open web favors and adopts certain languages and methods in the pursuit of creating a "standard", how can you fault Apple for favoring and adopting certain languages and methods to maintain its standard?
Flash offers client-side, web-based features and functionality by way of its proprietary, ubiquitous plug-in. Really, the only similar thing is Java, specifically the Java plug-in (not the programming language), and iWhatevers and other devices don't support that either!
Sure, you can create an application in Ruby or Python or C#; heck, you can create a desktop application in VB or C that connects to the internet, or an iPhone app in Objective-C that pulls in web content. But these all live outside the web browser, and they are not web applications by any accepted definition. Nothing even remotely ubiquitous allows Ruby to run on the client (i.e. in the web browser), and the same is true of Python and others. The same is *not* true of Java or Flash -- both are unique in that the plug-ins (the runtimes) that allow them to function on the client side (in a web browser) have been popularized and are ubiquitous. That means developers can realistically rely on Java or Flash (Flash being a far easier call than Java) for certain client-side features of a web application, and that is *not* true of Ruby, Python, C, Objective-C, or the rest.
Flash is not a web standard. Java is not a web standard. Web standards are necessarily evolving to incorporate, natively, what those plug-ins offer. And that means we are choosing to move away from selective dependencies on external plug-ins in favor of an integrated, native experience that every web browser can provide simply by following relatively lightweight web standards. We are agreeing on an open platform we can all implement and rely on, with no necessary dependency on proprietary or isolated components. Where a web application must interact with external components, say the hardware-specific accelerometer on an iPhone, there is a standards-based way of approaching this: detect what capabilities exist on the client and write special conditions to handle them (the same way you'd detect resolution, detect whether you're on a mobile device, or detect whether the browser has Google Gears installed for offline functionality).
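A minimal sketch of that capability-detection pattern (the function and property names here are illustrative, not any specific API): probe the client environment for what it actually supports, and branch on the result, exactly as you would for resolution, mobile detection, or Gears.

```javascript
// Capability-detection sketch: rather than assume a plug-in or device
// feature exists, probe the environment object and branch on the result.
// In a real browser you'd pass `window`; the shape here is illustrative.
function detectCapabilities(env) {
  return {
    // Native HTML5 video support exposes an HTMLVideoElement constructor.
    nativeVideo: typeof env.HTMLVideoElement !== "undefined",
    // Google Gears exposed itself as a global `google.gears` object.
    offline: !!(env.google && env.google.gears),
    // Touch-capable devices (like the iPhone) expose touch events.
    touch: "ontouchstart" in env
  };
}

// Example: an environment that supports native video but nothing else.
var caps = detectCapabilities({ HTMLVideoElement: function () {} });
// caps.nativeVideo is true; caps.offline and caps.touch are false.
```

The point isn't these particular checks; it's that the detection itself is done with standard client-side code, not by betting on a proprietary runtime being present.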
Still the argument goes on: but allowing other languages and frameworks on your platform is a way to win over those developers and get new apps. Saying no to other great languages and runtimes is bad.
Well, is it bad for the web? I think not. In the world of the web, an open platform owned by no one company, the dynamics are totally different. There are standards for the client side part of the equation. What you do on the server side (or on the development side, i.e. which tools you use to produce the client-side code) is up to you because that's isolated to your environment and the world wide web doesn't need to care so long as you send over the stuff it needs to see and interact with.
Web standards exist to manage the client-side experience. Anybody, anywhere, on any device should be able to have a similar experience. And it's all working over a global network. Given the complex, delicate nature of this open platform, there must be web standards. If web standards did not exist, none of the implementors of the client-side experience (web browser makers, device makers) would know what to support, some technologies wouldn't play well with others, some may be too intensive given network and device constraints, developers wouldn't know how to spend their time, people would be seeing different versions of the web given different client-side implementations, and the whole damn thing would be broke! In this way, there is no comparison to pre-web platforms like Windows.
Now, while Apple can point to the open web to help justify its refusal to support Flash, the analogy is limited.
For one, web standards are thankfully determined in a pretty practical way. Sure, there's the W3C, which anyone who wants to participate can join, and there are community discussions and decisions, but simple market forces are also at play: if HTML5 catches on and the development community widely accepts it as a standard, then implementors (i.e. web browsers) can support it, even before it's ratified as a standard. When there's enough support, developers can leverage these features in the client-side experiences they're building. And the web moves forward.
And web standards do, in fact, have a standard way to support non-standard components: a plug-in architecture and things like the object tag. The iWhatevers, on the other hand, do not. But Apple can refuse those non-standard components and still claim it's supporting an open web. As for *why* it would eschew Flash or other languages and frameworks at potentially great expense (e.g. sites break and fewer developers are attracted), refer back to the many other reasons Jobs described, including the unsaid one that defines Apple: walled gardens, in their estimation, make for better experiences.
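That plug-in escape hatch looks roughly like this (paths are placeholders): the object tag requests a plug-in by MIME type, and the nested content is the standards-defined fallback rendered by any client, an iPhone say, that doesn't have it.

```html
<!-- A plug-in requested via the standard object element; paths are placeholders. -->
<object type="application/x-shockwave-flash" data="player.swf"
        width="640" height="360">
  <param name="movie" value="player.swf" />
  <!-- Fallback: clients without the plug-in render this instead. -->
  <p>This content requires the Flash plug-in.</p>
</object>
```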
I welcome greater adoption of HTML5 and web standards, from Apple or anyone else. I want Flash to go. But Mr. Jobs, if you are rallying behind the open web and expect us to forgive Apple's closed nature and forced restrictions because of it, get out from behind the veil and get fully on board. You may not be able to open up everything right away because of important factors like experience, and I respect that, but when legitimate ways to open up further present themselves (for example, when web standards evolve to match the speed and device integration of native iWhatever apps), make the right moves. I believe your company's future depends on it.
To an open web.