When the first commercial computers came out in the 60s, they were so expensive that time-sharing was invented: people had to queue to get time on them (read accounts of those days in Hackers: Heroes of the Computer Revolution or Hard Drive: Bill Gates and the Making of the Microsoft Empire). In the 70s, minicomputers allowed for a wider distribution of computing power, opening the era of client-server computing. The 80s were then ready for the Personal Computer era, where most semi-basic work (spreadsheets, accounting, word processing, desktop publishing, etc.) could be done locally.
That paradigm lasted until the mid-90s, when the World Wide Web arrived on the consumer scene and we all started putting processing power back into a central computer: the web server associated with a site. Local computers tended to behave like dumb terminals again (à la VT100), with a browser on top. But users still did lots of stuff locally (email, photography, video editing, etc.).
Roughly a decade later (around 2004, with the launch of Flickr's API), the paradigm changed again. Browsers got intelligent (with AJAX, and soon HTML5), enabling local processing power and, more importantly, mashups (Wired magazine had a very interesting article covering this trend very early on - maybe I'll find it): mixing several sites/services, either client-side or server-side, to create a new service. The central computer (company) is not one anymore, but a number of them. Hence the term cloud computing was coined to describe this new "central computer" made up of many "central computers" (actually tens of thousands of them now, to cope with scale - check SlideShare for horror stories).
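To make the mashup idea concrete, here is a minimal, purely illustrative sketch in Python: it combines two hypothetical services (a geocoder and a photo search - the URLs and JSON field names are placeholders I made up, not real endpoints) into a small new service.

```python
# A toy mashup: combine two services to make a third. The endpoints and
# JSON field names below are hypothetical placeholders, not real APIs.
import requests

GEO_API = "https://geo.example.com/lookup"       # hypothetical geocoding service
PHOTO_API = "https://photos.example.com/search"  # hypothetical photo service

def photos_near(place):
    # 1. Ask the first service to turn a place name into coordinates.
    geo = requests.get(GEO_API, params={"q": place}).json()

    # 2. Ask the second service for photos taken around that point.
    photos = requests.get(PHOTO_API, params={"lat": geo["lat"],
                                             "lon": geo["lon"],
                                             "radius_km": 5}).json()

    # 3. The combination of both answers is the "new service".
    return [{"place": place, "title": p["title"], "url": p["url"]}
            for p in photos["items"]]

if __name__ == "__main__":
    for photo in photos_near("Paris"):
        print(photo["title"], "->", photo["url"])
```

The same pattern works client-side with AJAX calls from the browser; the point is that neither service alone provides the result - the value is in the combination.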
There's an interesting way to look at all this, analogous to the MVC model in software engineering:
- the presentation layer: what you see. Basically it's now your browser and many of your "web 2.0" applications. But with HTML5, it's soon going to be your operating system as well (Jolicloud is doing this, for example). It's like a super-charged VT100, capable of rich user interfaces. Processing power there should be limited to displaying intelligently whatever information you're looking at. It's already "cloud"-enabled, with native Internet protocols.
- the processing layer: see the history above. We started with mainframes, moved down to personal computers, and are now moving back up to "cloud" servers on the web. There's no real reason to buy computing power for your desktop or laptop anymore, since MOST of your processing is done on the web. So unless you have to do video rendering or serious gaming (but Bigpoint could prove you wrong), get a cheap computer (on the CPU side; ultra-fast on disk and memory - but that's another post).
- so we're left with the data layer. If we accept that computers themselves have lost their value, then there's a whole new opportunity in moving that value to the data:
- you might want your data with you all the time, WHATEVER computer you are using, which means you need access to (or a copy of) your data + your environment (settings) + your apps (bookmarks, passwords, or even installable apps) to read that data. Many companies are addressing parts of this: Dropbox, Allmyapps, exoplatform, AgileWeb... It's a complicated market, as most apps are not set up to host their settings and data on a remote drive, and it's complicated to do for each app. So either you set everything up locally, or everything remotely... in terms of simplicity: #fail (a rough sketch of this sync idea follows the list below)
- In addition, a lot of your experience happens online, with no local copy or backup (think blogs, Facebook, Twitter...). Since you're relying on this more and more, companies such as Backupify or LaCie are helping you access, but also secure, this data. The problem is that users don't understand the value of backups until they've experienced a disaster once. #fail
- since the world has gone quadruple-play (mobile, web, TV, connected devices), you want access to your data, with an appropriate "presentation layer" on each screen, wherever you are. Folks are attempting this: Dropbox is doing a great job, Twitter and Gmail are available almost anywhere, Pogoplug has a solution, but we're not quite there yet... #fail ;)
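As promised above, here is a rough, purely illustrative sketch of that "data follows the user" idea: push your data and settings to one personal store, then pull them back from any device. The store URL, token and file names are hypothetical placeholders, not the API of any of the services mentioned.

```python
# Sketch of a personal "data layer": one remote store, reachable from any device.
# The store URL, token and paths are made-up placeholders for illustration only.
import pathlib
import requests

STORE = "https://datastore.example.com/v1/files"  # hypothetical personal data store
TOKEN = "my-secret-token"                         # placeholder credential
HEADERS = {"Authorization": "Bearer " + TOKEN}

def push(local_path, remote_name):
    """Upload a local file (app settings, bookmarks, a document...) to the store."""
    data = pathlib.Path(local_path).expanduser().read_bytes()
    requests.put(STORE + "/" + remote_name, data=data, headers=HEADERS).raise_for_status()

def pull(remote_name, local_path):
    """Fetch the same file back on another device, recreating your environment."""
    resp = requests.get(STORE + "/" + remote_name, headers=HEADERS)
    resp.raise_for_status()
    pathlib.Path(local_path).expanduser().write_bytes(resp.content)

# On machine A: push("~/.myapp/settings.json", "myapp-settings")
# On machine B: pull("myapp-settings", "~/.myapp/settings.json")
```

Every device then points at the same store, so the data (and the environment around it) is what travels, not the computer.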
Hence, to me, the real challenge of cloud computing is no longer the processing layer: that was solved by the evolution of the web. The presentation layer still needs work on many devices (but plenty of folks are working on that now). It's all about the data layer: how do I get a seamless computing experience, whatever the device (even if it's not mine), wherever I am, today and tomorrow?
(Pix from Flickr)