Plugging a monitor and keyboard into the wall and having not only a rich user interface but unlimited power at your fingertips is a wonderful concept. The idea of abstracting the complexity of the computing/data center infrastructure from the computational process goes back to the time when mainframes ruled the land; users simply flicked on the tube and applications were just there. But that was a long time ago and seems like a galaxy far, far away.
Today, users have been empowered with greater computational power and increased flexibility. Unfortunately, this has been accompanied by increasing technology and application complexity.
The challenges of managing today's computational infrastructure are not getting easier. Financial markets firms are now rapidly adopting Linux-based clustering technology while extending their mainframe, Solaris and Windows-based environments. On top of this heterogeneous technology infrastructure, they are running a mixed operating environment, as service-oriented architectures are now layered upon traditional platforms.
To contain this sprawl, firms are consolidating technology within the data center. But how do data center personnel manage the infrastructure? Current data center management technologies do a good job of discovering and monitoring technology assets, but they cannot fix problems automatically or reprovision and reroute stalled workloads onto an available platform. This is where utility computing comes in.
While the utility computing hype centers on acquiring computer cycles like electrical power and paying only for what is used, most financial markets firms look at it differently. First, the technology needed to manage CPU cycles like electricity doesn't exist. Second, large firms, while willing to off-load a few processing jobs here and there, rarely want to outsource their entire technology infrastructure. What firms do want, and what utility computing can provide, is the ability to manage data centers holistically and more automatically.
Financial markets firms want to create a virtual computing utility within the firm - not outside it. Such a utility would let firms secure and discover technology assets; monitor their health; define and enforce service-level policies that match application workload with available resources; manage and virtualize the data and computing infrastructures; prepare applications for deployment; allocate applications to the proper resources; provision, route and schedule workloads; meter usage; and automatically manage the health and well-being of the computing infrastructure.
This is the ultimate flexible data center. The only problem is that the infrastructure isn't here yet. Many of the parts exist, but they are offered by start-ups - meaning they are not integrated - and in many cases are not proven at data-center scale. But there is hope.
Many of these start-ups are on the way to delivering off-the-shelf facilities for bulge-bracket firms. Concurrently, larger technology firms are investing in relationships with them, helping the smaller innovators develop, market, scale and implement their products. The larger vendors are also fostering the "on demand" culture, spreading the idea that financial institutions need not host their own data centers and that heavy-duty computational power can be acquired for as little as $1 per CPU per hour.
While these large technology providers cannot yet support the true utility concept, they are headed in that direction. And while financial markets firms are not currently interested in obtaining all their computing cycles from a plug in the wall, they are interested in making their data centers more robust, efficient and automated as they drive utility computing toward reality.
Larry Tabb is founder and CEO of Westborough, Mass.-based The Tabb Group, a financial-markets strategic-advisory firm. firstname.lastname@example.org