A contributed article by Adrian Sharp, CSC

CSC Perspective: Delivering Value through Business Process and Data

The industry needs to focus on more effective straight-through processing before it is in a position to fully evaluate the conversion from T+3 to T+1.

"...the industry needs to focus on more effective straight-through processing before it is in a position to fully evaluate the conversion from T+3 to T+1". SIA's STP Connections, July 18, 2002

Let's Talk about the REAL Issues
CSC released its perspective on the SIA's T+1 decision in early July, cautioning against the deferral of straight-through processing (STP) efforts in light of the anticipated conversion delay. We believe the SIA's decision to postpone the move to T+1 is based on sound reasoning, given the events following September 11th and changes in underlying market conditions. In addition to the industry-wide benefits of an efficient STP infrastructure, the benefits that accrue to individual firms are significant. Not least among these are a long-term reduction in IT overhead, lower implementation costs, faster time to market for new initiatives, and a reduction in operational risk. Two core elements of a sound STP program are solving reference data (data aggregation) problems and evolving from a functionally siloed application architecture into a business process-oriented architecture. Business process, in this context, denotes an end-to-end series of activities organized to deliver customer and business value.

Pending Non-discretionary Industry Events Provide Context
The first and most immediate of these is the U.S. Patriot Act. The law, enacted in October 2001, requires all broker/dealers to comply with the new S.A.R. (Suspicious Activity Reporting) Rule by January 2003. To comply with the new rule, a monitoring system must be in place to detect 'unusual behavior'. There are numerous challenges here, not the least of which is establishing a reference database of customer behaviors considered to be 'usual'. One of the recurring themes CSC encounters in working with clients is the issue of customer data being distributed across functional silos. Adding to the complexity, certain key reference data may be contradictory (if duplicated), or specific S.A.R. data may simply never have been recorded at all. A minimal sketch of the kind of baseline monitoring involved follows.
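To make the monitoring requirement concrete, the sketch below shows one simple way a reference baseline of 'usual' customer behavior might be maintained and queried. The class, fields and z-score threshold are purely illustrative assumptions, not part of the S.A.R. Rule or of any particular vendor offering.

```python
# Illustrative sketch only: a per-customer baseline of 'usual' activity,
# with new transactions flagged when they deviate sharply from it.
# All names, fields and thresholds here are hypothetical.
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class Transaction:
    customer_id: str
    amount: float

class BehaviorBaseline:
    """Reference database of 'usual' customer behavior (amounts only, for brevity)."""
    def __init__(self):
        self.history: dict[str, list[float]] = {}

    def record(self, txn: Transaction) -> None:
        self.history.setdefault(txn.customer_id, []).append(txn.amount)

    def is_unusual(self, txn: Transaction, z_threshold: float = 3.0) -> bool:
        past = self.history.get(txn.customer_id, [])
        if len(past) < 10:          # too little history to judge
            return False
        mu, sigma = mean(past), stdev(past)
        if sigma == 0:
            return txn.amount != mu
        return abs(txn.amount - mu) / sigma > z_threshold

baseline = BehaviorBaseline()
for amt in [500, 650, 480, 700, 520, 610, 590, 530, 640, 575]:
    baseline.record(Transaction("C-1001", amt))

print(baseline.is_unusual(Transaction("C-1001", 250_000)))  # True -> candidate for an S.A.R.
```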

Somewhat further out in time (2006) -- yet no less significant for the industry (banks in particular) -- are the implications of the Basel II Capital Accord. The Accord will require that a more risk-sensitive methodology be applied in determining the capital needed to cover credit risk and, for the first time, the industry will be required to make capital provisions for operational risk. The Accord outlines three methods, of increasing sophistication, for calculating capital requirements; adopting a more sophisticated method can significantly reduce the calculated capital requirement. The capital markets industry measures its performance in terms of 'Return On Capital' (ROC); hence, the implications for the industry are significant. Once again, a comprehensive database of historical data will be required to support a quantitative basis for determining risk and capital requirements. Today, much of this data is likely spread across multiple databases in multiple formats. Additionally, key credit risk data may not exist in readily accessible transactional form at all. Certain critical reference data may be embedded in the text of deal-specific bilateral credit agreements or, at best, recorded in proprietary spreadsheets. Implementing an operational risk management system will be challenging, since very few firms record structured historical data on operational losses to support a quantitative method -- and the methodology itself is still in its developmental stages.
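As a rough illustration of the simplest of the three operational-risk methods (commonly known as the Basic Indicator Approach), the sketch below applies a flat alpha factor to average positive annual gross income over the prior three years. The 15% factor is the commonly cited figure for that approach, and the income numbers are invented for illustration; the more sophisticated approaches the Accord allows require far richer historical data.

```python
# Illustrative sketch of the simplest operational-risk method under Basel II,
# the Basic Indicator Approach: capital = alpha * average positive annual
# gross income over the prior three years. All figures are hypothetical.
ALPHA = 0.15  # commonly cited Basic Indicator Approach factor

def basic_indicator_charge(gross_income_3y: list[float], alpha: float = ALPHA) -> float:
    # Years with zero or negative gross income are excluded from the average.
    positive_years = [gi for gi in gross_income_3y if gi > 0]
    if not positive_years:
        return 0.0
    return alpha * sum(positive_years) / len(positive_years)

# Hypothetical gross income (in $MM) for the prior three years
print(basic_indicator_charge([420.0, 380.0, -35.0]))  # -> 60.0 ($MM capital charge)
```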

The solution for Patriot Act and subsequent Basel II requirements must address the reference data and aggregation issues at a fundamental level.

The alternative is to exacerbate the existing problem by creating point solutions and additional application-specific data images, each of which has a non-linear impact on downstream complexity and maintenance overhead.

The focus must shift from Function to Business Process
Historically, commercial software development practices focused on the automation of manual processes. It is therefore natural that the great majority of legacy systems are strongly mapped to functional boundaries ("P&S", "Cashiering", "Trading", "Portfolio Management", etc.). In many cases, separate instances evolved to cover different instrument classes and/or geographies. Significant consolidation and M&A activity has added to technology portfolio complexity.

It is important to recognize that the terms 'Function' and 'Process' were used interchangeably when legacy systems were developed in the 1970s and 80s. This is fundamentally different from the concept of an 'Enterprise Business Process', which emerged with the Re-engineering discipline of the early 1990s.

In recent years, with increasing commoditization and margin pressure on traditional lines of business, great emphasis has been placed on technology as an enabler of highly differentiated value propositions. Many encouraging technologies were developed during the 90s (e.g. Message Oriented Middleware) that offered the potential for collaboration at the application level and some measure of STP capability. Those firms that had the luxury of starting with a clean sheet in the late 90s (B2Bs and B2Cs) were able to exploit these newer technologies fully. For the majority of bricks-and-mortar firms, however, the results were mixed. Their core legacy applications remain functionally stove-piped and were never designed for inter-process communication. Even today, for many firms, satisfying a simple customer request, such as a consolidated global statement, within a reasonable response time exceeds their capabilities.

Legacy systems are here to stay for the foreseeable future. Wholesale replacement is not an option, particularly in the current economic climate. It is therefore imperative that firms undertake a program whereby business processes can be externalized and updated whilst maintaining the underlying legacy infrastructure.

CSC Recommendations

First, ensure that business processes are fully understood (i.e., in their contribution to business value), documented and kept up to date. The emerging Business Process Management (BPM) technologies offer some potential for effectively abstracting core processes and associated business rules from underlying legacy systems and repackaging them into a set of re-usable service components. This task can and should be undertaken incrementally: the technologies to support this approach are in their early stages, and big-bang approaches are neither desirable nor necessary.
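By way of illustration, the sketch below shows how a single business process might be abstracted into a reusable service component that orchestrates calls into untouched legacy silos, with the business rules held outside the legacy code. All system, class and method names are hypothetical stand-ins, not a description of any specific BPM product.

```python
# Hypothetical sketch: an end-to-end business process exposed as one reusable
# service component, orchestrating calls into siloed legacy systems it does not replace.
class LegacyCashiering:            # stand-in for an existing functional silo
    def open_cash_account(self, customer_id: str) -> str:
        return f"CASH-{customer_id}"

class LegacyPortfolioManagement:   # stand-in for another silo
    def create_portfolio(self, customer_id: str) -> str:
        return f"PORT-{customer_id}"

class OpenAccountService:
    """One business process, many legacy calls, packaged for re-use."""
    def __init__(self, cashiering: LegacyCashiering, portfolio: LegacyPortfolioManagement):
        self.cashiering = cashiering
        self.portfolio = portfolio

    def execute(self, customer_id: str) -> dict:
        # Business rules live here, outside the legacy code, so they can be
        # updated without touching the underlying systems.
        return {
            "cash_account": self.cashiering.open_cash_account(customer_id),
            "portfolio": self.portfolio.create_portfolio(customer_id),
        }

service = OpenAccountService(LegacyCashiering(), LegacyPortfolioManagement())
print(service.execute("C-1001"))
```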

Second, the data-aggregation issues must be dealt with aggressively now rather than later. The need to satisfy new regulatory requirements and to meet increasingly demanding customer expectations makes this a critical issue. The emergence of industry utilities for the delivery of non-proprietary reference data (e.g. ACDEX and Reuters/Synetix) offers an alternative to wholesale internal replacement.

There are currently two approaches to the data-aggregation problem. The first relies on IPC (inter-process communication): an independent state keeper subscribes to all traffic, then assembles and aggregates reference data. The second relies on 'data mart' solutions, run as batch background jobs against multiple sources or data warehouses, to keep summarized data current. For highly distributed environments, the state keeper should provide services around current transactional data, and 'data marts' (smaller data warehouses geared toward operational uses) around historical and consolidated data. At this time the technology does not really support real-time access to, and aggregation of, data at its sources (the data stays in place while a process performs the collection); intermediary consolidations are still required. In the future, data integrity will be maintained not by replication but by making data available through service requests.
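As a minimal sketch of the first approach, the following state keeper subscribes to reference-data messages from several silos and maintains one aggregated, current view that downstream consumers request as a service rather than replicate. The message layout and field names are invented for illustration, and a production system would sit on real middleware rather than direct method calls.

```python
# Hypothetical sketch of an independent state keeper: it subscribes to all
# reference-data traffic and maintains one aggregated, current image,
# rather than each application keeping its own copy.
from collections import defaultdict

class StateKeeper:
    def __init__(self):
        self.customers: dict[str, dict] = defaultdict(dict)

    def on_message(self, message: dict) -> None:
        """Subscriber callback: merge each update into the aggregated record."""
        cust = self.customers[message["customer_id"]]
        cust.update(message["fields"])
        cust["source_systems"] = cust.get("source_systems", set()) | {message["source"]}

    def current_view(self, customer_id: str) -> dict:
        """Service request for current transactional data (no downstream replication)."""
        return dict(self.customers[customer_id])

keeper = StateKeeper()
# Messages arriving from functionally siloed systems
keeper.on_message({"source": "trading",    "customer_id": "C-1001", "fields": {"domicile": "UK"}})
keeper.on_message({"source": "cashiering", "customer_id": "C-1001", "fields": {"base_currency": "GBP"}})
print(keeper.current_view("C-1001"))
```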

Finally, as budgets for the coming year are being prepared, the industry will consider its priorities and the associated buy, build and outsource options. There will be the inevitable pressure to deploy 'apparent' quick-fix or 'lowest cost' solutions. History has consistently shown that this approach is a deceptively false economy, for which the price is always paid in increased downstream operating overhead and cost of change. ROI analyses must therefore be truly rigorous and go beyond simple unit cost comparisons.
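A purely hypothetical arithmetic example (all figures invented) shows how a multi-year view can reverse the ranking produced by a simple unit-cost comparison:

```python
# Hypothetical figures only: the 'lowest cost' point solution looks cheaper on
# initial unit cost, but not once downstream operating overhead and cost of
# change are included over a multi-year horizon.
def total_cost(initial: float, annual_overhead: float, annual_cost_of_change: float, years: int = 5) -> float:
    return initial + years * (annual_overhead + annual_cost_of_change)

quick_fix  = total_cost(initial=1.0, annual_overhead=0.6, annual_cost_of_change=0.4)  # $MM
structural = total_cost(initial=3.0, annual_overhead=0.2, annual_cost_of_change=0.1)  # $MM

print(quick_fix, structural)  # 6.0 vs 4.5 -> the 'cheap' option costs more over five years
```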
