Thursday, December 20, 2007

Future of IT: data centres and web-based development tools

Continuing my series on the future of IT, I originally planned to write two more posts: one on the future of data centres, and the other on the future of application development. But they quickly became the same article, because I predict that the two will come together.

Current application development tools, like Eclipse or Visual Studio, cover only a small part of the end-to-end process. What about business requirements, analysis, design, configuration management, testing, collaboration, release management, bug tracking, and hosting? Those activities are spread across a huge range of systems that don't integrate effectively.

It's been pointed out before, but application development is probably the least automated process I can think of. Imagine a website that stored your code and let you edit it, compile it, manage software configurations and releases, maintain bug databases, create test environments on the fly, design user interfaces in a WYSIWYG editor, and host the resulting application. Would you really go back to Visual Studio?

Following the Amazon web services model, developers won't need to know anything about the underlying hardware - they'll just see their memory, CPU, network bandwidth and storage usage, and be charged appropriately for each. The data centre sits entirely behind the scenes, provided by the development tool vendor.
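To make the metering idea concrete, here's a minimal sketch of what utility-style billing might look like. The resource names and rates are illustrative assumptions, not any vendor's actual pricing:

```python
# Hypothetical utility-style billing: the developer sees only
# resource consumption, never the hardware behind it.
# All rates below are made-up numbers for illustration.
MONTHLY_RATES = {
    "cpu_hours": 0.10,       # assumed $ per CPU-hour
    "storage_gb": 0.15,      # assumed $ per GB-month stored
    "bandwidth_gb": 0.18,    # assumed $ per GB transferred
    "memory_gb_hours": 0.05, # assumed $ per GB-hour of RAM
}

def monthly_bill(usage):
    """Charge for each metered resource independently."""
    return sum(MONTHLY_RATES[resource] * amount
               for resource, amount in usage.items())

# Example: one small application's month of usage.
usage = {"cpu_hours": 720, "storage_gb": 50,
         "bandwidth_gb": 200, "memory_gb_hours": 1440}
print(f"This month's bill: ${monthly_bill(usage):.2f}")  # $187.50
```

The point is that the unit of account is resource consumption, not servers - the developer never sees a machine.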

Which brings me to the data centre. In the last 18 months, the data centre of the future has become very clear, and all the major vendors are rushing to deliver what can be described in one word: virtualisation. Instead of building separate storage, database, and processing environments for each application, why not build one shared farm for each, to be drawn on by any application as it needs it? Capacity is simply added every month by plugging in a few more servers, based on demand.
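Here's a toy sketch of that pooling model, assuming a simple in-memory allocator. Real virtualisation layers are vastly more sophisticated, but the economics are the same:

```python
class ServerFarm:
    """Toy model of a shared, virtualised farm: capacity is pooled,
    applications draw on it as needed, and the operator grows the
    pool by plugging in more servers."""

    def __init__(self, servers=0, units_per_server=8):
        self.units_per_server = units_per_server
        self.free_units = servers * units_per_server
        self.allocations = {}  # app name -> units in use

    def add_servers(self, count):
        # "Simply add capacity by plugging in a few more servers."
        self.free_units += count * self.units_per_server

    def allocate(self, app, units):
        # Any application can claim capacity on demand.
        if units > self.free_units:
            raise RuntimeError("Farm at capacity - plug in more servers")
        self.free_units -= units
        self.allocations[app] = self.allocations.get(app, 0) + units

    def release(self, app):
        # Capacity returns to the pool when the application is done.
        self.free_units += self.allocations.pop(app, 0)

farm = ServerFarm(servers=4)      # 32 units of pooled capacity
farm.allocate("billing-app", 10)
farm.allocate("reporting-app", 6)
farm.add_servers(2)               # the monthly capacity top-up
farm.release("reporting-app")
print(farm.free_units)            # 38 units back in the pool
```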

Call it data centre 2.0 - the hardware has become totally commoditised, and the value has shifted to the management tools that plug everything together, which is why vendors like HP (with OpenView) and IBM (with Tivoli) have been investing so heavily.

Data centre 3.0 is what happens when the management tools become commoditised too, as they surely will, and quickly. Only Amazon, with its web services, is really positioning for this. Data centre 3.0 is when the value shifts to the only place left - the development tools. Data centre 3.0 is when developers outsource their data centres.

Nicholas Carr describes this in his book, The Big Switch. There will be only three or four major companies with public data centres globally - HP & IBM, plus a few others. Each will invest tens of billions of dollars in computing equipment, and between them, they'll host most of the applications in the world.

In addition, a small number of massive corporations will maintain their own private data centres in an effort to preserve a competitive edge. That will include the major investment banks, plus internet companies like Google. No one else will be able to compete with the sheer investment and scale required.

The benefits for developers will be huge. Imagine logging into developer.hp.com, selecting your language of choice, having compilation and debugging reports generated for you, creating test environments on the fly via a web interface, and working from a bug database linked to an automatically populated software configuration management tool - all without ever having to worry about the data centre. What an advantage.
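As a thought experiment, that whole workflow could be a handful of web calls. Everything below is imagined - developer.hp.com exposes no such API, and every endpoint and field name is a made-up assumption:

```python
import requests

# Entirely hypothetical API, sketching what a developer.hp.com
# workflow could look like; none of these endpoints exist.
BASE = "https://developer.hp.com/api/v1"
AUTH = {"Authorization": "Bearer <token>"}  # placeholder credential

# Spin up a throwaway test environment on the fly.
env = requests.post(f"{BASE}/environments",
                    headers=AUTH,
                    json={"language": "java", "size": "small"}).json()

# Push a build and let the service compile it and report back.
build = requests.post(f"{BASE}/environments/{env['id']}/builds",
                      headers=AUTH,
                      json={"source_ref": "release-1.2"}).json()
print(build["compile_report_url"])

# File a bug automatically linked to the build under test, so the
# bug database and configuration management stay in sync.
requests.post(f"{BASE}/bugs",
              headers=AUTH,
              json={"build_id": build["id"],
                    "title": "Login page fails under load"})

# Tear the environment down when testing is done - you pay for it
# only while it exists.
requests.delete(f"{BASE}/environments/{env['id']}", headers=AUTH)
```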

For corporations, the model offers a way out of managing complex and expensive data centres, replacing capital expense with monthly bills that scale with use.

Data centres have always existed for application hosting. In the future, they will exist for application management - not just hosting, but development, problem, configuration & capacity management. That's where the value is. Development tools will become the developer's front end to data centre 3.0.