Thinking about the future of computer applications

One of the great points of discussion these days is the movement of applications to the web. Some suggest that there will come a time when the web browser is the new operating system. Om Malik talks about this at CNN Money. The idea is that web browsers would be the platform that applications are built for. The underlying operating system between the web browser and the hardware would become irrelevant. Data would be stored somewhere on "the Internet cloud." This would allow all sorts of wonderful things, like access to applications and data everywhere and on any device.

It's an intriguing idea, but I'm not sure I buy it as the future. Don Dodge comments on some of the issues.

The problems with web-based applications have been: reliable Internet connections, fast bandwidth, and off-line operation. How many demos have you seen fail because the Internet connection went down, or the bandwidth was so slow it was painful to watch? It is getting better, but off-line use remains a problem.

He also talks about how he sees the future: the computer industry is moving towards web services. The problem of offline usage is still an open one, but we're moving towards solving it. Other issues of particular interest to enterprise users ("data security, advertising distractions, and quality of service") are also being worked on. These are all very interesting topics for discussion.
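To make the offline problem concrete, here is a minimal sketch, in TypeScript, of a web application falling back to locally cached data when the connection drops. The "/api/notes" endpoint and the cache key are hypothetical, and this is only one rough approach rather than a complete solution.

```typescript
// Hypothetical sketch: fall back to a locally cached copy when offline.
// The "/api/notes" endpoint and the cache key are invented for illustration.

const CACHE_KEY = "notes-cache";

async function fetchNotes(): Promise<string[]> {
  if (!navigator.onLine) {
    // No connection: serve whatever was saved last time, or nothing.
    const cached = localStorage.getItem(CACHE_KEY);
    return cached ? JSON.parse(cached) : [];
  }
  const response = await fetch("/api/notes");
  const notes: string[] = await response.json();
  // Keep a copy so the next offline visit still has something to show.
  localStorage.setItem(CACHE_KEY, JSON.stringify(notes));
  return notes;
}
```

Even a toy like this shows why offline use is hard: the application has to decide what to cache, how stale is too stale, and what to do with changes made while disconnected.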

Another topic of discussion is what this new model does to companies that rely on proprietary formats. Om Malik says:

If you're a developer or startup, you are suddenly free to write a browser-based application and quit worrying about which operating system, chip, or device your consumers are using.

It's a scary thought for anyone who built a business around proprietary formats. But for the end user, this is the kind of future that Andreessen on his best days - and maybe Gates on his worst - had envisioned.

This is an idea I have to think about. Off the top of my head, I don't see web services and web applications as being incompatible with proprietary formats. Writing a web application that can be used by all web browsers would, to be sure, have to use some standard formats. But the Internet is not just one protocol. Certainly some people will choose to develop their applications so that only standard formats are used and a web browser is the only tool needed locally.

In some ways this is the easy path, but there are costs to it as well. Not everyone may be willing to pay those costs, and some developers will use custom protocols and applications that are not standard parts of web browsers, or are perhaps even independent of web browsers entirely, to get the performance their customers demand.
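As a rough illustration of that alternative, here is a minimal TypeScript (Node.js) sketch of a client speaking a custom line-based protocol directly over TCP rather than through a browser and HTTP. The host, port, and "GETQUOTE" command are all invented for the example.

```typescript
import * as net from "node:net";

// Hypothetical sketch: a client speaking a custom line-based protocol
// directly over TCP, bypassing HTTP and the browser entirely.
// The host, port, and "GETQUOTE" command are invented for illustration.

const socket = net.createConnection({ host: "quotes.example.com", port: 9400 }, () => {
  socket.write("GETQUOTE MSFT\n"); // send one request in the custom format
});

socket.on("data", (chunk) => {
  console.log("server replied:", chunk.toString().trim());
  socket.end(); // one request, one reply, then hang up
});
```

A client like this trades the universal reach of the browser for complete control over the wire format, which is exactly the trade-off described above.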

People tend to focus on one tool or paradigm. For a lot of people it is desktop applications. Lately it seems that some see web browsers as the answer to everything. Some people see a mix of applications and web applications with, perhaps, web services tying the two together. Ray Ozzie talks about the "Client/Server/Services" continuum. There is room, I think, for proprietary "stuff" in all of these solutions. And I, for one, don't think we're done seeing new Internet protocols and applications that use them.
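To sketch what that middle ground might look like, here is a hypothetical TypeScript example of a "thick" client that keeps its own local state, desktop-style, but synchronizes it through a web service. The Task shape and the sync URL are assumptions made up for this example.

```typescript
// Hypothetical sketch: a thick client with desktop-style local state
// that synchronizes through a web service. The Task shape and the
// sync URL are assumptions made up for this example.

interface Task {
  id: number;
  title: string;
  done: boolean;
}

const localTasks: Task[] = []; // local state the client can use offline

async function syncWithService(): Promise<void> {
  // Push local changes up and pull the merged list back down.
  const response = await fetch("https://example.com/api/tasks/sync", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(localTasks),
  });
  const merged: Task[] = await response.json();
  localTasks.splice(0, localTasks.length, ...merged); // replace in place
}
```

The point of the sketch is simply that the client and the service each do what they are good at: the client stays responsive and usable offline, while the service handles sharing and storage.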

One piece that is missing in a lot of these discussions is "who is going to pay for all of this?" Somehow developers have to get the funds to keep a roof over their heads and food in their bellies. Unless they are going to work some other, non-development job to earn a living and donate their free time to developing applications for enterprise users, someone is going to be writing code for money. Who is going to pay for that development? IBM and other companies make money doing custom work to connect and enhance software that doesn't meet needs out of the box. Microsoft and many other companies make software that is designed to work out of the box and meet enough of a user's needs to be worth the money. Both models require that people pay for software development. Some user always pays something. Perhaps some ride along for free, but only if someone else is paying their way.

So where are applications going? Is it a waste of time to teach students, especially high school students who are potentially 4 to 8 years away from professional development, how to create desktop applications? Is web development the only way to go? Should we be focusing more on web services? Does anyone in high school computer science even talk about client/server applications? I wonder how much of that is taught in college, for that matter.

We can't teach everything, but we can talk about a lot more than we actually teach students to implement. Are you discussing the future of software development with your students? What are you saying? More importantly, where do your students think software is going? Are they thinking outside the box, or just parroting what they hear or read on the Internet? We're going to need some innovative ideas to progress into the real future.