October 07, 2013

CIOs may not be aware of the total cost of technology, but technology intensity is a fact of life in corporations and demand is outstripping the ability to invest in innovation.

That message was delivered at the Interop New York 2013 Information Week CIO Summit last week, where two speakers talked about the importance of understanding technology spending and how the cloud can alter the economics of innovation.

“Demand for technology is increasing faster than companies can grow revenue, faster than personal income is increasing and faster than profitability is increasing,” said Dr. Howard Rubin, president and CEO of Rubin Worldwide, at Interop’s Information Week CIO Summit last Wednesday.

In a session entitled “The Key to Innovation is Determining Your Real IT Costs,” Rubin said technology consumption is increasing at a faster rate than the world’s GDP. What’s more, the amount of money spent on technology exceeds the IT budget. The dilemma for CIOs is that they often don’t understand the total cost of technology, asserted Rubin, who offered data to support his points.

From 1980 to 2010, global GDP doubled while worldwide technology spending grew from $300 billion to $5 trillion a year. “Technology [spending] had gone up by a level of 14, just to keep the lights running with some innovation in there but at the same time, the GDP barely doubled,” said Rubin. In fact, Rubin, a pioneer in the field of technology economics, said if IT were a nation, it would represent the fourth largest GDP on the planet.

The average company spends on the order of 4% to 5% of its revenue on technology, said Rubin. But the amount spent on transforming the business is only 15% of that 5%, which means the average company is now spending 0.75% of revenue on innovation technology, "which is nothing," said Rubin. On top of that, if a company undergoes a revenue dip, it is often asked to cut discretionary expenses, which is where the transformation occurs.

Moore's Law Breaks Down

In the past, Moore's Law saved companies, since processing power doubled every 12 to 18 months; that is no longer the case. A tipping point was reached in 2010-2012, when Gartner Group data tracking 3,000 companies in 21 sectors showed that storage grew 45%, while servers, bandwidth and MIPS all rose more than 20%.

Today companies are investing in Big Data, doing more customer analysis, investing in mobile, and pushing out more products to offset the technology costs. And while firms have become more productive, improving the productivity of a resource increases its consumption, said Rubin.


Meanwhile, the growth of technology spending is largely uncharted. Going back to 2005, the average corporation in the U.S. was spending about $10,000 on IT per employee; by 2010 that figure was $12,000. But looking at total technology spend rather than the IT budget alone, Rubin estimates that from 2010 to 2012 companies were spending $94,000 on technology per employee.

He admits that few companies think about technology economics, since it is a relatively new field. Technology expense is growing 7 to 10 times faster than global GDP, so technology intensity is rising, said Rubin.

“In pursuing the real cost of IT in companies you have to take a total technology expense view,” said Rubin. Technology growth is on the front lines of the business: in marketing, in engineering and, for record companies, in iTunes, he said. “What technology economics tells us is that as we get better with technology, the demand for technology increases,” said Rubin. “With conditions in the global economy we can’t feed that thirst,” he added. Calling the current model “unsustainable,” Rubin advised companies to seek new approaches to technology, and that is where cloud, virtualization and new thinking about technology itself come in.