Size matters. Most people will say that bigger is better, and this is especially true when it comes to data centers. Often, size translates to economies of scale and lower overall costs.
For large web or consumer-focused companies -- think Facebook, eBay and Amazon -- running enormous, proprietary data centers makes a lot of sense. Facebook actually builds many of its servers from scratch to optimize performance. Some of the social-media giant's newer data center facilities are monuments to green technology, scale and efficiency.
In the financial services space, however, the strategy of big, proprietary data centers may be coming to an end. True, firms -- especially in the capital markets -- can gain competitive advantages by running applications in specialized data centers that provide proximity and ultralow latency. Likewise, financial firms are highly regulated and thus highly concerned with data security, which makes running their own data centers an ideal choice. But as the business and technology change, many firms are beginning to question how many applications actually need the white-glove, extremely fast, high-end, ultrasecure data center treatment.
As it turns out, the list of applications that need the aforementioned handling gets smaller by the day. Some applications -- such as HR and benefits systems, certain CRM applications, and a variety of back-office processes -- probably never really needed the ultrareliable, ultrafast service they have been receiving.
"If you look at the typical financial services firm application portfolio, approximately 30 percent of the applications are mission-critical and regulatory sensitive, while 70 percent are HR, basic financials, talent management or other things," asserts Tony Bishop, a former managing director with data center responsibilities at Morgan Stanley. "They don't need resilience at the highest level."
Unfortunately, Bishop notes, most firms have not even categorized their application library. "Most firms are challenged to even come up with a list of the thousands of applications they have and identify exactly where they are running," he contends.
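The triage Bishop describes -- tag each application by criticality, then see what share actually needs top-tier treatment -- can be sketched in a few lines. The inventory and tier labels below are purely illustrative, not drawn from any real bank's application list:

```python
# Hypothetical application inventory, tagged by criticality tier.
# Names and tiers are illustrative only.
APPS = {
    "order-management": "mission-critical",
    "market-data-feed": "mission-critical",
    "trade-surveillance": "regulatory",
    "hr-benefits": "standard",
    "talent-management": "standard",
    "expense-reporting": "standard",
    "crm-outreach": "standard",
}

def tier_share(apps, tiers):
    """Fraction of the portfolio whose tier is in `tiers`."""
    hits = sum(1 for tier in apps.values() if tier in tiers)
    return hits / len(apps)

high_end = tier_share(APPS, {"mission-critical", "regulatory"})
print(f"Needs high-end hosting:         {high_end:.0%}")
print(f"Candidates for cheaper hosting: {1 - high_end:.0%}")
```

Even this toy version makes Bishop's point: once the portfolio is actually enumerated and tagged, the high-end slice is a minority, and everything else becomes a candidate for cheaper hosting.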
Changes in the economy and the capital markets landscape, however, are forcing firms to rethink the ways applications are supported throughout the enterprise. When times were flush (before the financial crisis and briefly during the bull run of 2009), firms tended to overbuild, outfitting all of their data centers with the latest technologies and hardware. Facilities were a data-center geek's dream: brand-new, massive, packed with the latest technology and built with the future in mind.
Today, financial firms face a host of pressures, among them regulatory mandates that firms carry higher capital reserves to offset risks and provide a buttress during times of severe financial stress. In addition, firms have been forced by upcoming regulations to divest certain business units (think: proprietary trading). Meanwhile, the traditional revenue source for many capital markets firms -- trading -- has evaporated.
Too Much Tech Spending?
This drives business leaders to focus on squeezing out costs and looking for internal efficiencies. And unfortunately, IT spending by banks, in general, is much higher than in other industries. In fact, some experts believe it to be unsustainable. According to a study from Boston Consulting Group, the banking industry has 14.3 percent of total costs tied up in IT. All other industries average out at 7 percent, with insurance and telecom companies focusing about 10 percent of total expenditures on IT. For comparison, companies in the energy or industrial goods sectors allocate less than 3 percent of spend to technology.
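The BCG figures above are easier to compare as multiples of the cross-industry average. A quick back-of-envelope calculation (the percentages come from the article; the multiples are computed here, and the energy/industrial figure is an upper bound since the article says "less than 3 percent"):

```python
# IT spend as a share of total costs, per the BCG figures cited above.
IT_SHARE = {
    "banking": 14.3,
    "insurance": 10.0,
    "telecom": 10.0,
    "all-industry average": 7.0,
    "energy/industrial goods": 3.0,  # article says "less than 3 percent"
}

baseline = IT_SHARE["all-industry average"]
for sector, pct in sorted(IT_SHARE.items(), key=lambda kv: -kv[1]):
    ratio = pct / baseline
    print(f"{sector:>24}: {pct:4.1f}% of total costs "
          f"({ratio:.1f}x the cross-industry average)")
```

Framed this way, banking's IT cost base is roughly double the cross-industry norm, which is the gap the experts quoted here call unsustainable.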
"The cost of IT in financial services is too much," says Colin Kerr, technology specialist, worldwide financial services, at Microsoft. "The economics are becoming unsustainable. Banking is an IT-intensive industry, but when you look at it this way, [banking leaders] have to start making some decisions."
Couple the cost pressures with the explosion in data, and financial firms have a challenge. "There is a perfect storm from the explosion of [mobile] devices, stats and data, the increased costs of running a data center and the increased regulations," Bishop says. "You can put your head in the sand and ignore it, but many firms need to change and integrate their data centers so they can support the business."
They have to look at the entire picture, Bishop adds. "If a bank is pumping $100 million or more a year into a data center, that isn't efficient," he says. "It is slowly killing them."
Harish Rao, VP and chief technology officer for global infrastructure services at Capgemini, reports that many of his banking clients are struggling with similar cost pressures. "There is a sensitivity to maintaining the same level of capital spending on data centers, given the current state of the business," Rao says. "In the past, there was a lot of money allocated to the data centers, but that capital has become difficult to justify.
"When companies look at opex [operating expenditures] versus capex [capital expenditures], it is easier to justify opex," Rao adds. "They are tempted by the ease of budgeting and decisions with opex."
Given the massive investments already sunk into data centers, banks' traditional reliance on home-grown data-center capabilities and strong internal cultures, however, IT leaders are finding it difficult to change course. "Many banks are finding it hard to make these decisions," Rao says. "Their data center structure is strong, but what is the next step to take?"
Fortunately, attitudes toward vendor-hosted applications (software as a service, or SaaS) -- as well as platform as a service (PaaS) and infrastructure as a service (IaaS) -- and the cloud are softening. Perhaps the technology is getting better, the economic pressures are becoming too great, or both, but many firms are considering "all of the above" as they look to shift their data center strategies and costs.
"Technology is more robust and it can handle large workloads," according to Jeremy Sherwood, product manager, virtualization and cloud, for ScienceLogic, a provider of data center and cloud management technology. "Some of the older hardware is on its way out, and you absolutely have more capabilities with the new processors, with multicore and with hyperthreading capabilities.
"On the software side, the virtualization has gotten a lot better," Sherwood continues. "And hypervisors have gotten more robust. The combo of the chip, the hypervisor and the software getting better is making alternative forms of hosting applications very attractive."
Greg MacSweeney is editorial director of InformationWeek Financial Services, whose brands include Wall Street & Technology, Bank Systems & Technology, Advanced Trading, and Insurance & Technology.