STP: Half a Century of Unfulfilled Promises
Commentary by Allan Grody

Having long pursued the STP vision under the mantra of processing efficiency, the industry is now seeking global standardization solutions to STP under a new mantra: mitigating global systemic risk.

There was a time when the vision for straight-through processing (STP) was all about the “locked-in trade,” an idea spawned back when equity trading floors were of the human kind and the communications vehicle was voice over telephone. If we could only get a trade into the new devices connecting teletype machines and TV-like screens, into the hands of a broker, onto an exchange floor, and back again -- all within a computerized workflow -- it would be “locked in” so computers could process the transaction, presumably error free.

This materialized in the form of data terminals connected to high-speed telephone networks linked through switching and storage computers. It enabled firms to send, store, and place an order onto an exchange, then retrieve the executed order, match it to the sent order, and send it back to its origination point. A locked-in trade for sure, but still insufficient for STP.

STP matures
We learned quickly that there was a next step to STP: settling the trade -- transferring ownership and getting paid. So we moved from bilateral payment and settlement to multilateral netting and novation. This allowed the industry to build efficient computerized clearing mechanisms and depositories in equity and bond markets, and also to computerize the clearing houses and central counterparties in contract markets. We did the same for our sovereign debt markets. Now surely the STP vision was fulfilled. No, not quite.
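
As a rough illustration of why that shift mattered, here is a minimal sketch of multilateral netting, using a handful of made-up bilateral obligations (participants and amounts are for illustration only): once a central counterparty is novated into every trade, each firm settles a single net amount rather than many gross ones.

# Minimal sketch of multilateral netting: gross bilateral obligations collapse
# into one net position per participant against a central counterparty (CCP).
# Participants and amounts are made up for illustration.
from collections import defaultdict

# (payer, receiver, amount) -- gross bilateral settlement obligations
obligations = [
    ("A", "B", 100),
    ("B", "C", 100),
    ("C", "A", 80),
    ("A", "C", 30),
]

net = defaultdict(float)
for payer, receiver, amount in obligations:
    net[payer] -= amount
    net[receiver] += amount

# Through novation the CCP becomes the counterparty to every trade, so each firm
# settles only its net amount with the CCP instead of every gross obligation.
for firm, position in sorted(net.items()):
    print(f"{firm}: {'receives' if position >= 0 else 'pays'} {abs(position):.0f}")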

We then built post-trade matching and allocation facilities, so that groups of exchange and dealer trades, originated by multiple money managers and destined for a myriad of custodians, could be processed efficiently by computer. We recognized the burgeoning asset securitization markets and successfully fit them into our ever more complex, yet evolving, computerized attempts at an STP environment. We dealt with mortgage pass-through securities with their pools and pre-payment speeds, and later their risk tranches, and fit them in as well. We dealt with the extended supply chain of investment managers, hedge funds, and later high-frequency traders, who were moving off of telephones and faxes and needed computerized straight-through-processing access. We plugged them in through the ubiquitous Internet and latency-busting fiber networks.

Other markets and the STP mantra
We began to see listed contract markets evolving multi-manager pooled accounts and limited partnership structures, similar to the evolving collective vehicles in the capital markets. These too needed post-trade matching and allocation processes so that the myriad of administrators validating trades and preparing statements and K-1s could do their jobs more efficiently.

Later the industry invented OTC derivatives and tried to fit them into the ever more complex, evolving, computerized, not-yet-completed STP environment. This didn't quite work, since these were one-off trades done between two counterparties, a throwback to the days when most transactions were entered into on a one-to-one (bilateral, to use industry jargon) basis. Later innovations had another component -- a reference entity, as in CDSs -- creating three-way relationships and tri-party agreements, not to mention the retro method of physical contracts, even though these evolved in streamlined fashion as Master Agreements.

It got really complex when all these products -- bonds, mortgage-backed securities, securitized loans, and asset-backed securities of all types -- got aggregated into higher-order derivatives, the infamous CDOs, with all kinds of risk appetites to choose from. These CDOs were again aggregated into CDO-squared securities, with the mind-numbing possibility that a fiduciary who wanted to do proper due diligence would have to read through a billion pages of offering memoranda and prospectuses.

Regulation swept in after the financial crisis of 2007-2008 and foisted new infrastructure entities on the OTC derivatives markets: market data displays of quotes, sale prices, and volume; swap execution facilities (SEFs); central counterparties (CCPs); and swaps data repositories (SDRs).

Data standardization and the STP mantra
Along the way we recognized that we should standardize the way we send data, to help foster computerized STP pathways. So we used evolving technologies and invented data tagging conventions like FIXML, FpML, and XBRL. This allowed us to move one step back in the STP pathway, to the originators of documents and data, thinking that by standardizing message formats and surrounding the basic elements of a financial transaction with data tags, computers could talk seamlessly to each other and hunt down any piece of data.
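
To make the tagging idea concrete, here is a minimal sketch of surrounding the basic elements of a trade with data tags; the tag names and values are illustrative placeholders, not the actual FIXML, FpML, or XBRL vocabularies.

# Minimal sketch of wrapping the basic elements of a trade in data tags so a
# downstream computer can locate each field. Tag names and values are
# illustrative placeholders, not real FIXML, FpML, or XBRL schema.
import xml.etree.ElementTree as ET

trade = ET.Element("trade")
ET.SubElement(trade, "buyer").text = "BUYER-ID-PLACEHOLDER"
ET.SubElement(trade, "seller").text = "SELLER-ID-PLACEHOLDER"
ET.SubElement(trade, "instrument").text = "XYZ-EQUITY"
ET.SubElement(trade, "quantity").text = "1000"
ET.SubElement(trade, "price").text = "42.50"
ET.SubElement(trade, "currency").text = "USD"

message = ET.tostring(trade, encoding="unicode")
print(message)

# Any receiving system that knows the tag vocabulary can "hunt down" a field:
print(ET.fromstring(message).findtext("price"))  # 42.50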

It seemed as though we were getting closer to the STP vision. We could now go from the mind imagining a new product or offering, written down in a prospectus, articles of incorporation, or a Master Agreement (really processed in word form), to a prescribed data language, so that computers could update reference databases and trades could be assembled seamlessly from those component parts.

We could now see a clear vision of arriving in STP land -- or could we? It turns out that along the way we forgot to standardize on a common language for the data content: common IDs, common data tags, common data definitions, and so on. We had done this locally -- a version for each vendor, firm, market, even sovereign jurisdiction -- when what we needed was a global standard in keeping with the global nature of the financial system. We needed a common language that was universal, so transactions could seamlessly flow across the globe and be processed and aggregated by computer. Instead we had built a Tower of Babel or a Rube Goldberg device (take your pick): huge infrastructure edifices, mapping systems, and shared mapping facilities that added huge cost and certainly enormous operational risk to the financial system.

Allan is President and founder of financial industry joint venture development company Financial InterGroup Holdings Ltd and strategy & acquisition consultancy Financial InterGroup Advisors. The companies are engaged in the capital, contract, currency, cash and investment ...
Comments
Greg MacSweeney, 5/29/2014
Finding value in STP
The previous STP movements stalled because it was more costly to continue to implement STP technology, and the value to the business was lower than the cost of the STP projects.

Will the same be said of STP again in a couple of years, once firms commit to RDA and are able to get a complete grasp of their financial risk?
allang119, 5/30/2014
Re: Finding value in STP
First, there were only two studies that I know of that estimated the value of STP to the 'industry'. Tower Group's Global STP Study in 2001 pegged the cost of the lack of STP at $12 billion. An STP-related SIA/Capco/Accenture study on T+1 in 2000 concluded that T+1 would result in annual cost savings of about $2.7 billion -- and would cost the industry $8 billion to achieve, a three-year payback at the industry level. This latter study also identified reference data as the second most costly inhibitor of STP. Our own study, done on 2004 data and updated in 2012, pegged the duplicate annual cost of reference data alone among the largest financial institutions at $1,318 - $2,322 million on average. There are now some 49 SIFIs identified to date, those being the largest and most interconnected global financial institutions that could be the most significant beneficiaries of STP savings. That would put the industry's duplicate spend just on reference data at $50+ billion.
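
A quick back-of-the-envelope check of how the industry-wide figure follows from the per-institution range, assuming that range is in millions of dollars per year:

# Back-of-the-envelope check of the industry-wide duplicate reference data spend,
# assuming the per-institution range cited above is $1,318M - $2,322M per year.
num_sifis = 49
per_firm_low, per_firm_high = 1_318e6, 2_322e6  # dollars per SIFI per year

low, high = num_sifis * per_firm_low, num_sifis * per_firm_high
print(f"Industry-wide: ${low / 1e9:.0f}B - ${high / 1e9:.0f}B")  # about $65B - $114B, i.e. $50+ billion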

Second, there is now regulatory compulsion at the root level for STP. It is driven by the need to aggregate data for systemic risk analysis. It is a G20-inspired global identification system for financial market participants and the products they trade. If the global commercial supply chain can have its equivalent of STP based upon a global ID system (the codes populating the ubiquitous bar code, RFID, and QR matrices), we should be quick to leverage this effort and move to reengineer our financial institutions. Tech-savvy companies like Walmart, Amazon, and FedEx have gained significant STP efficiencies.

In the final analysis, the greatest inhibitor to STP is the business silo mindset that inhabits the large financial institutions. Enterprise-wide initiatives of this magnitude can only be the province of the CEO, as that is where all these separate expense budgets come together. It would be a waste of billions if the industry had to wait for another generation of CEOs, one that comes from the technology-savvy world, to tackle the STP problem.
IvySchmerken, 5/31/2014
The Push for T+2
Allan, first I want to thank you for writing this amazing, comprehensive history of STP that spans decades, numerous asset classes, and standards initiatives, and presents a vision of where STP must go.

One initiative you didn't mention was the Global Straight Through Processing Association, from the late 1990s, comprising leading buy-side and sell-side firms and Omgeo, if I'm not mistaken. It was focused on automated trade matching. I'm not sure what the group accomplished, but it ultimately fizzled out.

Regulators are now driving the urgency of STP to prevent systemic risk. DTCC is now urging the industry to shorten the settlement cycle to T+2. You also talk about the need for synchronizing the settlement cycles across different asset classes. Are you in favor of T+2? How hard will it be to get there? And why not go straight to T+1, like futures?
allang119, 6/2/2014
Re: The Push for T+2
Ivy:

Notwithstanding the perception of futures trading being far ahead on STP, there are manual steps in the process performed over disputed trades, which get 'adjudicated' prior to the opening of the next morning's trading day -- basically through the traders' out-trade reconciliation ethos of "I'll eat this one today, you owe me the next one." It is also much easier to trade, clear, and settle in vertically integrated venues, as the futures industry does. Equity markets trade in multilateral venues, so more time is required to put the pieces together, reconcile each piece to the other side of the trade, novate and clear each trade, then send it on for depository or custodian trusteeship. Truth be told, in the futures industry one piece -- the post-trade allocation process of multiple CTAs (Commodity Trading Advisors) transacting through multiple clearing FCMs (Futures Commission Merchants) that advise and execute on a single portfolio -- is not well automated, and automating it is a very important next step for futures markets in order to get to STP. This is the equivalent of multiple investment managers transacting through multiple executing brokers in equity trading. In futures markets, many administrators still have to piece together multiple segments of trades done by multiple CTAs through paper confirmations in order to produce statements and K-1s per client.
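
To make that gap concrete, here is a minimal sketch of the post-trade allocation step described above -- splitting block executions done by multiple CTAs through multiple FCMs back to the client accounts of a single portfolio. The account names, trade fields, and pro-rata rule are illustrative assumptions, not any particular platform's format.

# Minimal sketch of post-trade allocation: block executions done by multiple CTAs
# through multiple clearing FCMs are split back to the client accounts of one
# portfolio. Fields and the pro-rata rule are illustrative assumptions only.
from collections import defaultdict

executions = [  # block trades as reported by each clearing FCM
    {"cta": "CTA-A", "fcm": "FCM-1", "contract": "FUT-XYZ", "qty": 100, "price": 1700.25},
    {"cta": "CTA-B", "fcm": "FCM-2", "contract": "FUT-XYZ", "qty": 60, "price": 1701.00},
]

accounts = {"ACCT-1": 0.5, "ACCT-2": 0.3, "ACCT-3": 0.2}  # target portfolio weights

allocations = defaultdict(list)
for ex in executions:
    for acct, weight in accounts.items():
        allocations[acct].append({
            "contract": ex["contract"],
            "qty": round(ex["qty"] * weight),
            "price": ex["price"],
            "cta": ex["cta"],
            "fcm": ex["fcm"],
        })

# An administrator would reconcile these computed allocations against the paper
# confirmations mentioned above before producing client statements and K-1s.
for acct, legs in allocations.items():
    print(acct, legs)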

Now to the ambitious STP effort of GSTPA's Axion 4. It was developed by SegaIntersettle, the Swiss depository; SWIFT, the financial messaging network; and TKS Teknosoft, a Swiss-Indian software development joint venture. The industry raised $100 million from 100 financial institutions globally and issued an RFP. This joint venture won the bid against DTCC and others. It failed because DTCC persisted, after it lost, in leveraging its existing legacy -- equivalent, although less ambitious -- trade management system. Initially DTCC faced off against Thomson Financial, which also had a less ambitious legacy trade management activity. Both existed and were operational. So the legacy best practices of the past, not a new best practice for tomorrow, won the day. Eventually, under threat of a patent infringement claim by Thomson against DTCC, the two agreed on a joint venture and named it Omgeo. Omgeo pushed GSTPA's Axion 4 aside, even though Axion 4 was set to create a more real-time matching flow manager, with each side of a transaction prepared and entered into the matching engine as it became available. It showed promise as a mechanism that could evolve into real-time STP and shrink the settlement cycle. Its success required global reference data to be standardized around its own developed standards.

The true test of STP was to come in the greenfield opportunity presented by regulators to automate the OTC swaps markets. SEFs (swap execution facilities), aka swaps exchanges, were where swaps contracts were to trade in a multilateral environment, forwarding their executed trades to an SDR (swaps data repository) of their choice, and doing this in T+1 time. Unfortunately, the precursor to this potential -- a global identification system for counterparties and contracts, and the reference data associated with each -- got off to a rocky start (to be kind to the dysfunction that is now apparent to everyone). It is currently stalled in regulatory indecision. Industry practitioner comments have been requested publicly by the CFTC and by the global standards body, the Financial Stability Board. The industry needs to help regulators, first not to miss this opportunity, and second to show that STP is possible in this new market, having learned our lessons from past missteps based on legacy thinking. Cooperation before competition is necessary.

STP requires globally unique identification of financial market participants, of the contracts and instruments they trade in, and of the financial events (mergers, acquisitions, et al.) that change both. All of these have a source or origin: the financial market participant itself -- as counterparty, as issuer, as originator of a corporate change of control. Financial market intermediaries, whether data vendors, market infrastructure utilities, or financial institutions, that interject themselves in this flow add risk to the system and additional infrastructure costs, and basically impede the realization of STP.
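
A minimal sketch of the identification model described here, using illustrative placeholder identifiers and fields rather than the actual legal entity or product identifier specifications: each participant, instrument, and corporate event carries a globally unique ID that links back to its source of origin.

# Minimal sketch of globally unique identification: participants, the instruments
# they trade, and the corporate events that change both, each linked back to the
# participant that originated it. IDs and fields are illustrative placeholders,
# not the actual legal entity or product identifier specifications.
from dataclasses import dataclass

@dataclass
class Participant:
    entity_id: str      # globally unique entity identifier (placeholder format)
    name: str

@dataclass
class Instrument:
    instrument_id: str  # globally unique contract/instrument identifier
    issuer_id: str      # links back to the Participant that originated it

@dataclass
class CorporateEvent:
    event_id: str
    event_type: str           # merger, acquisition, etc.
    affected_entity_id: str   # the Participant changed by the event

issuer = Participant("ENTITY-0001", "Example Issuer PLC")
bond = Instrument("INSTRUMENT-0001", issuer.entity_id)
event = CorporateEvent("EVENT-0001", "merger", issuer.entity_id)
print(issuer, bond, event, sep="\n")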

With a global identification system and standard reference data and data tags in place we can realize the vision of STP, no sooner and no other way. T+2, even T+1, should be intermediate goals to ready the industry on the way to a real-time order-trade-clear-settle-pay mechanism across all asset classes. The design of this solution cannot be accommodated in single markets or by single market infrastructure utilities; a global design needs to be drafted, geared to attain real-time STP. Only when we know how we are to attain the final real-time STP solution do intermediate solutions like T+1 and T+2 make sense. The good news is we have a global regulatory standards initiative underway -- we just have to make some needed adjustments, significant ones I might add, and we have the pillars upon which to build a final STP solution. Visionary industry leadership is required.