
Find me on Google+

Google+ lets me blog, tweet and more. I'm increasingly active there, and will one day integrate my blog with it, once there's a good API.

In the meantime, find me there. I opine about Google+ itself and more.

An entirely new converging world

I recently read (in the Economist, my hands-down favorite magazine) this great profile of Hans Rosling, the man behind superb data visualizations about our rapidly changing world via his company Gapminder, whose Trendalyzer software Google acquired back in 2007 and made more broadly available as Google Public Data Explorer.

Read the article, but also check out this video, which gives you a taste of Rosling's passion for opening up this data and making it truly accessible -- and, more importantly, for what it teaches us about our world: a rapidly changing, converging world in which the dynamics today are radically different than they were not too long ago. I love this kind of stuff because it enables us to better understand our world and our future, empowers us to make a difference, identify opportunities, and effectuate change and progress, and ultimately makes the world a smaller place by making us aware.

Beyond e-mail: Facebook Messages, Gmail and the future

Email has been king, but it's in many ways an archaic technology, and attempts to evolve it have been increasing. By some measures, more people now interact via social networking than by email and for good reason. Social networks offer rich insights into who people really are and give you better control of who you really want to communicate with (goodbye, spam). Your friends and their activity are already there, and so it's only natural to start a conversation in the same place. A social network that is big enough is like a global address book, letting you find someone by name to instantly send them a message. And since you've presumably already defined who you care about most, you can see messages from them first.

With 500 million users, Facebook is the ultimate social network and is in a unique position to take this concept forward and create a robust messaging system. Lo and behold, and as widely expected, they just announced a new Messages feature that looks like it gets a whole bunch right. It centers conversations around people, eliminates subject lines to make starting a conversation easier and less formal, and even centralizes some of the disparity across email, instant messages and texts. And it seamlessly integrates with email, even giving you the option of being reached at an email address. It's well thought out and leverages the strengths of a social network that's supposed to know who you want to be communicating with.

A revamped Messages feature is just as important as Facebook's recent fundamental revamp of its Groups feature, which begins to let you mold your social network based on how you really interact in the real world. Together, these features begin to transform the social network destination into a really useful communication tool.

While Facebook is adding more robust communication tools to its social networking platform, Google is set to add more robust social networking to its communication and collaboration tools, and thus we have what I think is one of the most important battles for dominance in the future we're creating.

There is no question in my mind that Google is working to leverage its Gmail success and is building an integrated communication tool that is beyond email. (I'd wanted to write a post titled "Google Wave is not dead" making the case that the technologies previewed in Wave will in fact make their way into a more integrated tool -- it was only "killed" because Google quickly realized it had no future as a standalone, sprawling product.) They need more social glue to be able to solve the big problem that Facebook has just addressed -- that is, they need to know more about the people you want to connect with to be able to offer more control and less spam -- but if and when they do, they've also got a reputation for creating the best productivity tools, and they're sure to back it up.

Facebook's greatest strength (i.e. that it is a social network) is also its greatest liability. It's a social destination, not associated with the more mature tools you use day to day in your productive lives, and so it'll be a hard sell convincing people to bring the important people and communication in their productive lives into Facebook. If it can do this successfully (and Groups is a big step towards addressing the problem of "everyone's equally my friend") and if it proves it can create more robust productivity tools (they've got nothing on the scale of Gmail, Docs, Picasa), then Facebook stands a great chance of winning in a future that is unfolding, a future beyond "social" networks, towards "actionable" or "collaborative" networks.

For now, though, Facebook will remain a social destination. By the looks of it, I'll probably do more communicating via Facebook (and many will choose to do the bulk of their personal communications there), but I won't be prepared to migrate my productive communication in the foreseeable future. In the meantime, Google can wow me with a more network-aware Gmail and a more integrated suite of products, ensuring that they remain the leading provider of the powerful, productive, collaborative tools that are such an important part of our cloud-based future.

Who do you think is in the best position to become the de facto center of your communications in the future?

Update: Here's a full set of screenshots of the new Messages experience on Facebook. It's rolling out slowly over the next few months.

The networks can't stop the future, i.e. Google TV

Re: reports that some of the big networks are blocking Google TV from accessing full shows via their websites, the same websites that are publicly available via any other means of accessing the web, my thoughts are as follows.

This is analogous to a situation where the networks blocked your Samsung LED TV because they hadn't worked out an arrangement w/ Samsung. Or blocked your Firefox browser on a Dell machine because they hadn't worked out an arrangement w/ those respective companies.

Google is not slapping ads on or alongside any existing content. If you're on NBC's site on your Google TV, you're seeing what they choose to show you, including ads they control. Wouldn't content producers want more eyeballs on their decidedly public content?

Sure, Google is in a strong position controlling the interface to that content, and the networks fear this just like big industry has feared most every innovation forever. But what do they fear exactly? Google would never rip out an existing content provider's ads and slap on their own -- it would be an outrageous and self-harming move, and there are laws against that stuff (or will be in a jiffy, at least). Are they afraid of a potential deluge of new eyeballs on their web content, pulling viewers away from their currently more lucrative cable content? I understand that they'd like to control the pace of change themselves, but they all know evolution is inevitable and delivery is moving to the web -- so they should figure out a way to harness these changes and turn them into money.

This won't last long. Market forces will prevail and it simply won't be acceptable to block content because of one's choice of browser. I wouldn't be surprised if a law were introduced prohibiting that, though I think legislation would be ridiculous. And when Google TV supports apps, someone may create a user-agent switcher making it hard for these sites to block it.

TV is converging with Web. There may be a ruckus along the way, but it's happening.

Or as succinctly stated here:

[Br]oadcast networks can’t stop the future, no matter how hard they try. TV viewing habits are increasingly trending towards consumers skipping traditional broadcasts and instead catching up via DVR or web video at their convenience. Forcefully cutting off legitimate points of access like Google TV or Apple TV isn’t going to stop users from skipping live broadcasts. The networks’ time would be better spent looking at how they can take advantage of new viewing trends, instead of hopelessly trying to defend dying business models.

The open versus closed debate continues

I'm sure you've watched the recent exchanges. Jobs blasts Google and Android, and the debate about open versus closed (or fragmented and integrated, as Jobs would have it) continues.

I think this nicely nails it:

It's almost a debate of capitalism versus socialism. And I think there are merits to both. :)

"I think it's going to be a challenge for them to create a competitive platform and to convince developers to create apps for yet a third software platform after iOS and Android," Jobs continued. "With 300,000 apps on Apple's App Store, RIM has a high mountain ahead of them to climb."

Comments like that will be fun to look back on in a future I think is inevitable, where all apps are web-based, the entire user experience is in fact in the cloud, and comparisons are about hardware instead. The real question might be: who (Apple or Google) is leading the charge into that future? Leave a comment and opine.

A very basic outline of computer programming

Question from a reader who's getting ready to go to college and considering a future in computer science.

I really don't know the first thing about programming or computer science, so if you could give me a very basic outline of what those two things entail it would be much appreciated.

My response:

Google knows better than me. :)

But here's a go:

Every application (some people call it a program) that runs on your computer (called a desktop application) or on the web (called a web application) has to be programmed, i.e. coded. Meaning, you have to write the logic, the instructions that are then interpreted (i.e. turned into something that works) by the system that the application runs on.

So, for example, Notepad is a simple application that runs only on Windows systems. It was built using one or more programming languages and tools that let you develop applications for Windows. You need Windows to run that application. By the way, Windows itself is an application, only it's an uber-application of sorts, i.e. an operating system which makes your computer functional and able to run other applications.

The web is a much cooler place to build applications for. One major reason is that anything you create for the web just needs a modern web browser to run... and modern web browsers run on every system (Windows, Mac, Linux, even mobile phones). So right away the application you build works for most everyone. Another major reason is that web applications (like Gmail) can be accessed from anywhere, instead of being confined to your computer like a desktop application (like Notepad). That's because the application is really running on some computer out there ("in the cloud" as it's referred to) which serves it to any web browser that requests it (hence it's called a server).

Whether developing for the desktop or the web, there are various "languages", each with their own syntax (style of coding). There are also various "frameworks" or "platforms", which make it easier to write certain types of applications (in certain languages) by giving you a kick start so to speak. They do this by bundling certain core code and providing certain methods that you can then leverage to write your application more rapidly.

Many languages and frameworks are proprietary (owned by one corporation, which licenses the technology and the tools needed to write applications using that language) and many are "open source" (the creators give it to the world for free, and many people then contribute to it, making it better for all).

Every application typically has several parts to it, from a programming standpoint. You need a language (i.e. a method of creating instructions) for the logic part of your application (i.e. if this then do this, or if that then do that), which is called a scripting language. You need a database language, for querying the database where all your application's data is stored (e.g. your user's name, their login information, whatever), which you then use in your logic (e.g. if the user is logged in, print "Hello" plus their name to the screen). You also need a display language, because creating nice visual interfaces for your application requires its own mannerisms and programming methods -- so it handles things like font sizes and colors and layouts and graphics and what not. All of this stuff (logic + database + display) works hand-in-hand to form an application. There are many more pieces to the overall puzzle, but that's beyond us right now.
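To make that triad concrete, here's a toy sketch -- in JavaScript purely for brevity, with every name invented -- where an in-memory object stands in for the database, one function carries the logic, and the output is an HTML string for display:

```javascript
// Toy illustration of logic + database + display (all names invented).

// "Database": a stand-in for what a real query would return.
const users = { rebecca: { name: "Rebecca", loggedIn: true } };

// Logic: decide what to show based on the data.
function greetingFor(userId) {
  const user = users[userId];      // the "query"
  if (user && user.loggedIn) {     // the "if the user is logged in" rule
    return "Hello " + user.name;
  }
  return "Please log in";
}

// Display: wrap the result in markup for the browser to render.
function render(userId) {
  return "<p>" + greetingFor(userId) + "</p>";
}

console.log(render("rebecca")); // <p>Hello Rebecca</p>
```

In a real application each piece would likely live in a different language (as described next), but the hand-in-hand relationship is the same.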

One of the most popular web programming languages (it's a scripting language) is PHP. It handles all the logic for sites like Facebook. One of the most popular database systems is MySQL. Then, of course, there's HTML which handles the display side of the equation for web applications.

Besides HTML, there are some other important pieces on the display side. There's JavaScript, which works within the display side (the HTML) of a web application to handle some logic (e.g. if the user clicks the plus sign, slide out this section of the page). There's also CSS, which also works within the display side (the HTML) and it handles the precise positioning and display of all the elements on the page (e.g. put a 1 pixel border around this element, and position it 5 pixels away from the element above it).
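Here's a minimal sketch of that division of labor, in plain JavaScript with an invented stand-in object instead of a real page element (in a browser, CSS would supply the actual borders, spacing and slide animation):

```javascript
// Sketch of the JavaScript side of "if the user clicks the plus sign,
// slide out this section" -- modeled without a real browser DOM.
function makeSection() {
  return { expanded: false, style: { display: "none" } };
}

function onPlusClick(section) {
  section.expanded = !section.expanded;
  // JavaScript just flips the state; CSS decides how the change *looks*.
  section.style.display = section.expanded ? "block" : "none";
  return section;
}

const section = makeSection();
onPlusClick(section);
console.log(section.style.display); // block
```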

It's a lot of fun -- like building legos, only it can help people and change lives.

I never took computer science, but I learned a lot from mentors who did. A computer science degree will give you a firm understanding of how computer systems work, of advanced programming concepts, and more. But don't expect it to teach you the languages and skills you'll need when you get out of college. For that, you have to play around yourself -- get involved in programming while you're in school, take on a project, contribute to an open source project. Hands-on is everything.

Don't let anyone bore you along the way. It's very exciting, very fun, there are many different areas to focus on within it, and it's an ever-growing, ever-important part of our economy and of civilization in general.

How's that? :)

My Google birthday cake

My awesome wife, Rebecca, had her cousin Lauren make this dope and delicious cake:

Google cake by Lauren Matalon

Why all the HTML5 hoopla, you ask?

From a reader: I think the case for HTML5 is being a bit overstated. Even if it is widely adopted (which I think it will be), I am not sure how dramatically different the experience will be. It removes the need for plug-in RIAs like Flash and Silverlight, but that is tantamount to standardization of technology that has more or less been around for years.

From one angle, the whole point is indeed that the experience will be the same. So from a typical end-user's perspective, it doesn't matter a hoot.

It does matter for some end-users from a marketing perspective -- just yesterday I met with a prospective client and they wanted to move their (uneditable, unsearchable, unmanageable) site from Flash to HTML. They'd heard for themselves that HTML itself could do all the cool stuff like animations, while potentially solving the usual pitfalls of Flash because HTML is more a part of the web -- and they're novices. There's been a perception that if you wanted a cool, "flashy", interactive site (especially for entertainment, fashion, arts) you wanted to go with Flash. That's now going out the door, thanks to fodder from Google and Apple and the broader community, and I welcome that. I would have to explain to clients that they didn't need Flash just because they wanted a rotating feature spot or fading graphics on their website!
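Indeed, a fading effect that once seemed to demand Flash reduces to simple arithmetic that JavaScript can run on a timer. This toy sketch (all names invented) just computes the per-frame opacity values:

```javascript
// A fading "feature spot" is just a number marching from 0 to 1.
// This computes the opacity for each animation frame.
function fadeSteps(frames) {
  const steps = [];
  for (let i = 0; i <= frames; i++) {
    steps.push(Number((i / frames).toFixed(2))); // 0 -> 1 over `frames` ticks
  }
  return steps;
}

// In a page you'd apply each value on an interval:
//   element.style.opacity = steps[i];
console.log(fadeSteps(4)); // [ 0, 0.25, 0.5, 0.75, 1 ]
```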

And it of course does matter from a developer's perspective and a broader development-of-the-web perspective. The case for HTML5 is largely a developers' rallying cry, pushing the community to adopt it and pushing tool-makers to support the standard. Why? A web based on standards makes for a better web -- no more plug-in issues (how many businesses refuse to allow Flash?) = fewer dependencies to worry about, deep integration with every other aspect of the client-side stack = a richer experience, lighter-weight and tightly-coupled to the core standard = easier to deploy and support on all kinds of devices on our fragile web, open versus binary = searchable and editable the web way, standard versus proprietary = legally and forever a part of the open web owned by no one, and so on and so on. Further, it bridges a gap amongst the web development community which is important from a broader industry standpoint.

So the case for HTML5 is a case because that's how evolution and adoption goes. It's as simple as that.

Some HTML5 fun:

Apple's HTML5 examples:
Chrome Experiments:
Full-on Quake game w/ HTML5 (must be a geek to compile):

If you want to try the Apple stuff on Chrome, change your user-agent to trick the site into thinking you're running Safari with this extension:

P2P is the Future of the Cloud (or Why Diaspora is the right response to Facebook)

In response to a discussion about Facebook privacy concerns and what some kids are doing about it with the Diaspora project, I repeated what is a common mantra with me: P2P is the future.

It's just a matter of time. I keep saying it. It needs to happen and it will happen. It will address our issues with giving all of our data to any one big company, with capacity, with privacy, with reliability. Of course, for this to happen complex network infrastructure needs to be open sourced and some things need to be invented. But luckily, the software is what's most important. For us to get there, there need to be major market forces and other incentives at play. One such force is the backlash over privacy we're seeing -- and we'll continue to see a lot more until efforts start being channeled towards open source, decentralized and distributed alternatives. Eventually, they'll be just as good as centrally served experiences, and eventually they'll be better and the current client-server model will be but a memory.

When asked how that fits in with the idea of the cloud, I offered the following explanation.

The cloud is simply a term to describe the idea that the complex infrastructure that serves your applications and your data (i.e. your computing experience) resides "out there in the cloud". It's a marketable term that helps avoid unnecessary talk about servers and datacenters and colocation and content delivery networks. All of the messy stuff from expensive hardware to expensive security processes and more is handled by one or more companies, and you just consume using thin clients (i.e. anything with a web browser and a local cache). It helps us into the future, as people start to understand and accept the idea of software as a service.

P2P describes how some of that messy network stuff can work. The idea is that the network is the computer. Everything -- all the code and data -- that makes up our experience is distributed over countless nodes: your computer, mine, everyone's. The processing of all that stuff is handled by the commodity processors in all the nodes (our computers, kiosks, anything sufficiently internet connected). There may be companies and governments that offer supernodes which help bring more resources to the grid. All of this stuff is encrypted and distributed, and your stuff is only accessible by you.

And in fact that's how much of this stuff is working right now. Google's cloud is really a private distributed P2P infrastructure. They have commodity servers with certain hardware and software specs and these are geographically distributed as well. They just plug in a new node (server) and it starts talking to the other nodes and it's ready to go. There's no dependency on any one node -- if one node dies, traffic avoids it and a new one is plugged right in. Data is distributed redundantly across the entire global network. You log in from anywhere and you've got access to all your stuff.
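The redundancy idea can be sketched with a toy placement function. This is purely illustrative (nothing here reflects Google's actual infrastructure; real systems use far richer schemes like consistent hashing and replication protocols): hash a key to pick a starting node, then keep copies on the next few nodes so no single failure loses data.

```javascript
// Toy redundant placement: each key lives on `replicas` nodes.
function hash(key) {
  let h = 0;
  for (const ch of key) h = (h * 31 + ch.charCodeAt(0)) >>> 0;
  return h;
}

function placeKey(key, nodes, replicas) {
  const start = hash(key) % nodes.length;
  const placed = [];
  for (let i = 0; i < replicas; i++) {
    placed.push(nodes[(start + i) % nodes.length]); // spread over k nodes
  }
  return placed;
}

const nodes = ["node-a", "node-b", "node-c", "node-d"];
const copies = placeKey("user:rebecca/mail", nodes, 2);
// If one copy's node dies, the other still serves the data,
// and a replacement node can re-replicate it.
console.log(copies.length); // 2
```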

Imagine if Google released their P2P network to the world (I'm hopeful) and we each ran one of those nodes. Well, that's true P2P. We are all contributing and receiving (more or less depending on your node's constraints) on this P2P network. It's a commons of sorts, owned by no one.

Gone is the issue of Google having all our data -- because it's everywhere and nowhere on a distributed, encrypted network. Gone the Facebook privacy concerns -- because you own your information and you decide how you want it to interact with other services. Gone is the issue of capacity -- because the network's capacity grows with every node that joins it. Gone the ability for the government to subpoena all your search queries from Yahoo or Google -- because your stuff's flowing through no one company. And it would severely hamper the ability of a government like China or Iran to shut down services -- because you can no longer tie a particular service to a particular isolated range of IPs.

Of course, there are so many new challenges... but such is the nature of our evolving world order.

These folks with the project called Diaspora -- I think they get it. I don't care if their project is all smoke and mirrors and nothing ultimately comes of it. As far as I'm concerned they're bringing a necessary and oft-ignored discussion to the forefront. I welcome that.

Apple, Adobe and the Fight for an Open Web

There's been a lot of banter about Apple's refusal to support Adobe's Flash on its iWhatevers. Adobe weighed in unofficially and now Apple's Steve Jobs penned a letter.

Mr. Jobs gave the expected reasons for Apple's refusal to support Flash, and all were well articulated. I personally appreciate Apple's stance because I think Flash has to go, but simply find it ironic that Apple is pushing for an open web in this case. Mr. Jobs, although you skim over apps in your diatribe, we know the full story. Proprietary apps that only work on your devices do their fair share to stifle the open web. But I do understand that going this route allows you to offer a clean experience that's tightly coupled with the device right now, and I'd like to have faith that you will one day evolve this situation along with evolving web standards, and eventually abandon device-centric apps in favor of the open web. How and how quickly you do this will tell me a lot about your true interest and relevance in the open web you claim to espouse.

Many (including Adobe) would argue that Apple is unnecessarily being picky about the way things are built for their platform. Flash, they say, is a framework and language that has great tools and a great following so why not support it?

Well, for one, if you think Apple should simply give you choice with little other consideration, you're asking them not to be Apple. Apple is defined by their walled garden approach (see my previous post), making the experience better through thoughtful restrictions. Whether it's limiting their OS to their own hardware or deciding which apps you can run on your device, this approach arguably makes their products better and more accessible for many. Jobs articulated several reasons why Flash will impair the experience, both for the end-user and, they believe, for their developers. If we don't like it (I happen to agree), we can only hope for or create other choices. (And we have other choices with Android, thanks to Google.)

I think it's mostly a happy byproduct that Apple finds itself appearing to protect and support the open web, but they're right to "believe that all standards pertaining to the web should be open", and that is the most interesting part of this discussion. So what if Flash has great tools and a following? That alone does not determine what should be an accepted web standard. Microsoft also happened to have great tools and a following -- but ActiveX was a nightmare that nobody respectable would argue should be a web standard. Web standards are critical to the evolution of the web.

Open web standards are evolving (with HTML5 and beyond) to include native video, animation and 3D, and it's imperative that this happens. Apple may have beef with Adobe, and its own inane or justified reasons for not supporting Flash in the meantime, but it's in a better position to do this because the web development community knows that Flash must go in favor of web standards. Any uproar is therefore muted, because Apple slyly appears to be working on the community's behalf by not supporting Flash.

Flash, in its current form, will go and must go. I agree with Steve Jobs (whoa) that Adobe should focus on evolving their great tools to support web developers and designers who are creating video, animation and 3D for a web that is based on open standards.

The argument continues: but what if someone likes Ruby, Python, or C#? Apple is shooting themselves in the foot by being picky about the way things are built for their platform.

Well, then you'd have to say that the open web is shooting itself in the foot by being picky about the way things are built for its open platform. And that would be ridiculous. If you don't think there need to be web standards, you're on the wrong side.

Do you see what's going on here? Jobs has cleverly shifted the discussion about how open Apple should be to a discussion about the open web and what open really is. In other words, if the open web favors and adopts certain languages and methods in the pursuit of creating a "standard", how can you fault Apple for favoring and adopting certain languages and methods to maintain its standard?

Flash offers client-side web based features and functionality by way of its proprietary, ubiquitous plug-in. Really, the only thing similar is Java, specifically the Java plug-in (not the programming language), and iWhatevers and other devices don't support that either!

Sure, you can create an application in Ruby or Python or C#, heck you can create a desktop application in VB or C that connects to the internet, or an iPhone app in Objective-C that pulls in web content. But these are all outside of the web browser, and they are not web applications by accepted definition. There is nothing even remotely ubiquitous that allows for Ruby to run on the client (i.e. in the web browser), and the same is true of Python and others. The same is *not* true of Java or Flash -- both are unique in that the plug-ins (the runtimes) that allow for them to function on the client side (in a web browser) have been popularized and are ubiquitous. This fact means developers can in fact consider relying on Java or Flash (Flash being a far easier call than Java) for certain client-side features of a web application, and this is *not* true of Ruby or Python or C or Objective-C or others.

Sure, there's an obscure effort to allow for Ruby in the browser (which isn't really Ruby in the browser, as it relies on JavaScript and Flash given their ubiquity). And there are obscure efforts to force other languages in. But this demonstrates why we need web standards! If everyone tried shoving their language or framework in by way of some plug-in, we'd have a very broken web! (How many "I don't have the plug-in" complaints do we see?) To see the web properly, we need web browsers that are all showing us (interpreting) the web in more or less the same way. And so to assuage the issue of one web browser not being able to see the same stuff as another browser, we have web standards guided by committee and community. (There are of course many more reasons.)

The web standard with respect to client-side functionality is centered around JavaScript, interacting with the DOM (Document Object Model, or the hooks into various elements of the page) and standards-based features of the web browser (like scrollbars or browser size or local storage in HTML5).

Flash is not a web standard. Java is not a web standard. Web standards are necessarily evolving to incorporate what those plug-ins offer, natively. And that means we are choosing to move away from selective dependencies on external plug-ins in support of an integrated, native experience that every web browser can incorporate simply by following relatively light-weight web standards. We are agreeing on an open platform we can all implement and rely on, with no necessary dependency on proprietary or isolated components. With respect to where a web application must interact with external components, say the hardware-specific accelerometer on an iPhone, there is a standards-based way of approaching this where you can detect what capabilities exist on the client and write special conditions to handle that (the same way you'd detect resolution, or detect if you're on a mobile device, or detect if the browser has Google Gears installed for offline functionality).
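That detect-and-branch pattern can be sketched in a few lines of JavaScript. The `client` object and the feature names are invented stand-ins for what a real page would read off `navigator` and `window`:

```javascript
// Sketch of standards-style capability detection: inspect what the
// client offers and branch, instead of assuming one device.
function chooseFeatures(client) {
  const features = [];
  if (client.accelerometer) features.push("tilt-controls");
  if (client.offlineStorage) features.push("offline-mode");
  if (client.screenWidth < 480) features.push("mobile-layout");
  return features;
}

// A phone-like client gets the mobile treatment...
console.log(chooseFeatures({ accelerometer: true, offlineStorage: false, screenWidth: 320 }));
// [ 'tilt-controls', 'mobile-layout' ]

// ...while a desktop browser gets a different set.
console.log(chooseFeatures({ accelerometer: false, offlineStorage: true, screenWidth: 1280 }));
// [ 'offline-mode' ]
```

The same code serves every client; only the detected capabilities differ.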

Still the argument goes on: but allowing other languages and frameworks on your platform is a way to win over those developers and get new apps. Saying no to other great languages and runtimes is bad.

Well, is it bad for the web? I think not. In the world of the web, an open platform owned by no one company, the dynamics are totally different. There are standards for the client side part of the equation. What you do on the server side (or on the development side, i.e. which tools you use to produce the client-side code) is up to you because that's isolated to your environment and the world wide web doesn't need to care so long as you send over the stuff it needs to see and interact with.

Web standards exist to manage the client-side experience. Anybody, anywhere, on any device should be able to have a similar experience. And it's all working over a global network. Given the complex, delicate nature of this open platform, there must be web standards. If web standards did not exist, none of the implementors of the client-side experience (web browser makers, device makers) would know what to support, some technologies wouldn't play well with others, some may be too intensive given network and device constraints, developers wouldn't know how to spend their time, people would be seeing different versions of the web given different client-side implementations, and the whole damn thing would be broke! In this way, there is no comparison to pre-web platforms like Windows.

Now, while Apple can point to the open web to help justify Apple's refusal to support Flash, the correlation is limited.

For one, web standards are thankfully determined in a pretty practical way. Sure, there's the W3C, which is composed of anybody who wants to take part, and there are community discussions and decisions, but there are also simple market forces at play -- i.e. if HTML5 catches on and there is widespread acceptance of it as a standard among the development community, then implementors (i.e. web browsers) can support it, even before it's ratified as a standard. When there's enough support, developers can leverage these features in the client-side experiences they're building. And the web moves forward.

And web standards, in fact, do have a standard way to support non-standard components -- via a plug-in architecture and things like the object tag. The iWhatevers, on the other hand, do not. But Apple can refuse those non-standard components and still claim it's supporting an open web. As for *why* it would eschew Flash or other languages and frameworks at potentially great expense (e.g. sites don't work and not as many developers are attracted), refer back to the many other reasons Jobs described including the unsaid one that defines Apple: walled gardens make for better experiences in their estimation.

I welcome greater adoption of HTML5 and web standards, from Apple or whomever. I want Flash to go. But Mr. Jobs, if you are rallying behind the open web and expect us to forgive Apple's closed nature and forced restrictions because of it, get out from behind the veil and get fully on board. You may not be able to open up everything right away because of important factors like experience, and I respect that, but when legitimate ways to open up further present themselves (for example web standards evolve to match the speed and device integration of native iWhatever apps), make the right moves. I believe your company's future depends on it.

To an open web.