How to Assess Your Firm's IT Productivity

Capital markets technology organizations are under extreme pressure to do more with less. Rubin Worldwide founder Howard Rubin offers firms some guidelines for measuring IT productivity.

There should be no question that companies are under pressure to increase IT productivity. In today's economic climate, the technology economies of business and government organizations must produce "more for less," "far more for the same" or some variation of these themes.

The financial services sector -- and even more so, the Street -- is the poster child for this phenomenon. The need for computing power is rising faster than revenue is growing (in the few firms where such growth is apparent at all), and the pressure is amplified in firms where revenue is stagnant or declining. Add the ever-present "need for speed" in applications development and infrastructure deployment, and it is clear that information technology organizations must become more productive. But what the heck does that mean, and how would we know if it had actually been achieved?

It sure would be great if there were a way to measure IT productivity. But there doesn't seem to be a useful metric out there.

IT Productivity Measurement Approaches

Defining an accurate (and universally agreed-upon) measure of information technology productivity is perhaps the "holy grail" of IT measurement. Going back 40 to 50 years or more, there have been many approaches to -- perhaps more accurately, "attempts" at -- measuring the productivity of selected aspects of IT.

Among the most famous/notorious are those targeted at software development productivity and programmer productivity, such as function points delivered or maintained per person (per month or per year). Other measures focus on infrastructure support productivity, such as servers per system administrator (SA) or desktops per SA.
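To make those spot measures concrete, here is a minimal sketch of the arithmetic; every figure is hypothetical, chosen only to show how the ratios are formed, not to suggest a benchmark.

```python
# Hypothetical figures for illustration only -- not benchmarks.
function_points_delivered = 1_200   # delivered over the year
developer_count = 25
servers_supported = 480
sysadmin_count = 12

# Software development productivity: function points per developer per year
fp_per_developer = function_points_delivered / developer_count   # 48.0

# Infrastructure support productivity: servers per system administrator
servers_per_sa = servers_supported / sysadmin_count              # 40.0

print(f"Function points per developer per year: {fp_per_developer:.1f}")
print(f"Servers per SA: {servers_per_sa:.1f}")
```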

On its own, however, none of these candidate measures can answer the question, "Is our IT productivity increasing?" or even, "How does our IT productivity compare with our key competitors' productivity?" Developing an approach to answering these frequently asked questions is best done by going back to the roots of productivity itself.

Productivity Defined

The measurement of productivity classically focuses on ratios of output to related input. For example, one of the most visible measures of the U.S. economy is output per worker (or economic output per worker hour). An increase in the value of this measure is used as an indicator of increased national productivity.
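In code, the classic ratio and its signal are a one-liner each: productivity is output divided by input, and a year-over-year rise in that ratio is the indicator. A minimal sketch with invented figures:

```python
# Hypothetical, economy-style example: output per worker, year over year.
output_by_year = {2012: 10_500_000, 2013: 11_000_000}  # units of output
workers_by_year = {2012: 1_000, 2013: 1_010}

productivity = {y: output_by_year[y] / workers_by_year[y] for y in output_by_year}
growth = (productivity[2013] - productivity[2012]) / productivity[2012]

print(f"Output per worker: {productivity[2012]:,.0f} -> {productivity[2013]:,.0f}")
print(f"Productivity change: {growth:+.1%}")  # positive => productivity rose
```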

In the context of information technology, however, there is no single equivalent measure. As noted above, IT productivity is typically assessed in terms of economic efficiency (unit costs of key IT services) and support ratios (business or technology volumes relative to IT services or staffing levels).

In addition, IT productivity is often measured by the operating leverage that automation provides. Ratios of IT expense to operating expense therefore also play a role in gauging IT productivity, but as a time series rather than a single snapshot. If an organization is obtaining operating leverage from IT, then even as IT costs rise, total operating expense should decline or grow more slowly -- so IT expense as a percent of operating expense might increase while the year-over-year growth in operating expense decreases.
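The sketch below illustrates that time-series reading with invented figures: IT spend rises and its share of operating expense grows, while year-over-year growth in total operating expense decelerates -- the signature of operating leverage.

```python
# Hypothetical time series (in $M): rising IT spend, decelerating total opex.
years = [2010, 2011, 2012, 2013]
it_expense = [100, 108, 117, 126]
operating_expense = [1000, 1040, 1061, 1072]

for i in range(1, len(years)):
    # Year-over-year growth in total operating expense
    opex_growth = (operating_expense[i] - operating_expense[i - 1]) / operating_expense[i - 1]
    # IT expense as a share of operating expense
    it_share = it_expense[i] / operating_expense[i]
    print(f"{years[i]}: opex growth {opex_growth:+.1%}, IT as % of opex {it_share:.1%}")
```

Run on these numbers, the IT share climbs from 10.4% to 11.8% while opex growth slows from +4.0% to +1.0% -- exactly the pattern described above.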

Finally, from yet another vantage point, the productivity of IT often is viewed in the context of the growth rate of key business transactions, volumes or activities versus the change in IT expense. If an organization's business volumes are declining, one might expect IT expense to move in a similar direction -- though there are many circumstances in which this would not be the case.
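A directional check along these lines might look like the following sketch; the growth figures are hypothetical, and the exception noted in the comment is only one example of a legitimate divergence.

```python
# Hypothetical directional check: volumes vs. IT expense, year over year.
volume_growth = -0.06      # business volumes down 6%
it_expense_growth = +0.04  # IT expense up 4%

if volume_growth < 0 < it_expense_growth:
    print("Flag: volumes falling while IT expense rises -- worth investigating,")
    print("though it may be justified (e.g., a regulatory build-out).")
else:
    print("IT expense is moving with business volumes.")
```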

Overall, it is the pattern of change in all of these dimensions that is critical to an assessment of IT productivity. And benchmarks, if applied concurrently, can be used as a basis for comparing performance against an external reference point (where relevant benchmarks are available).

The IT Productivity Pattern

Forget the idea of a single IT productivity measure for a few moments. Consider instead the concepts (and components) of a pattern that would indicate changing IT productivity. The pattern would simultaneously consider the organization's leverage of Moore's Law, its efficiency in delivering and supporting applications and infrastructure, the alignment of IT resources to the business(es), the focus of IT on activities that grow the business, IT economic agility, and clear evidence of IT's contribution to business outcomes. If this composite indicator (the output) rises faster than IT costs and resources (the input), then the assertion that IT productivity is increasing is likely correct.
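One way to sketch such a composite indicator -- with placeholder dimensions, values, and equal weights, not a prescribed formula -- is to index each dimension to a base year and compare the combined output index with the growth of the input:

```python
# A sketch of the pattern idea, not a prescribed formula. Each dimension is
# indexed to a base year (1.00 = no change); equal weights are placeholders.
dimensions = {
    "moores_law_leverage": 1.12,           # unit cost per compute cycle falling
    "delivery_efficiency": 1.05,           # e.g., function points per programmer
    "business_alignment": 1.02,
    "grow_the_business_focus": 1.04,
    "economic_agility": 1.03,
    "business_outcome_contribution": 1.06,
}

composite_output = sum(dimensions.values()) / len(dimensions)  # ~1.053
input_index = 1.03  # IT costs and resources grew 3% vs. the base year

print(f"Composite output index: {composite_output:.3f}")
print(f"Input index (IT costs/resources): {input_index:.3f}")
if composite_output > input_index:
    print("Pattern reading: IT productivity is likely increasing.")
else:
    print("Pattern reading: no meaningful gain, whatever the spot measures say.")
```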

Meanwhile, a firm may be able to leverage Moore's Law to drive down the unit cost per processing cycle and increase its internal efficiency -- as measured by support ratios and measures such as function points per programmer -- while IT resources increase even as business volumes decrease. Some of the "spot measures" of IT productivity would then give a false positive (indicating higher productivity), but the overall reading would be that IT productivity is not increasing in any meaningful way. The pattern tells the story, not the individual measures (see sidebar).

Dr. Howard A. Rubin is a Professor Emeritus of Computer Science at Hunter College of the City University of New York, an MIT CISR Research Affiliate, a Gartner Senior Advisor, and a former Nolan Norton Research Fellow. He is the founder and CEO of Rubin Worldwide.