The recovery is under way. Banks are strengthening their balance sheets and returning to profit. With this encouraging news, firms have a prime opportunity to restructure technology and strengthen internal data management capabilities.
These improvements will give banks the agility required to meet future business strategies and ongoing rapid changes in market conditions. But without the ability to manage data quality with a consistent approach, how can these improvements be achieved?
During 2008 and 2009, firms and banks witnessed how a lack of transparency and timely data flow throughout an organization contributed to the economic downturn. The events of Sept. 15, 2008, when Lehman Brothers filed for bankruptcy, among others, showed that firms need to be able to monitor, assess and react across business lines within the day -- and they need the correct data to do this.
Maintaining data in silos and duplicating that data independently across front-, middle- and back-office systems has limited firms' ability to maintain an accurate view of their positions. They are left without the real-time data required to make tough decisions on when to increase or pull funding to a business line. But where does this problem start?
Currently, the middle office accepts data sent from front-office systems and external data feeds as accurate and complete. But this isn't always true. To perform intraday analytics, the middle office builds a new aggregation platform on data that is not always reliable, and incorrect or incomplete data will never yield analysis accurate enough for solid business decisions. For example, a global organization might have data from Asia-Pacific (APAC) but not yet from Europe or North America because of the time of day; when the missing data arrives, it must be incorporated and the corrected data moved throughout the workflow. That change could alter the analysis.
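The time-of-day problem above can be made concrete with a small sketch. This is a minimal, hypothetical illustration -- the field names, regions and figures are invented, not drawn from any real system -- of an aggregation step that flags which regional feeds are still missing, so downstream analytics know the intraday view is partial rather than treating it as complete:

```python
# Hypothetical position snapshots from regional front-office feeds.
# Only the APAC feed has arrived at this point in the day.
snapshots = [
    {"region": "APAC", "desk": "rates", "notional": 120.0},
    {"region": "APAC", "desk": "fx",    "notional": 45.0},
]

# Regions the firm expects to report before the view is complete.
EXPECTED_REGIONS = {"APAC", "EMEA", "AMER"}

def aggregate_with_coverage(snapshots):
    """Aggregate notionals, but record which regions are missing
    so the result is labeled as a partial intraday view."""
    total = sum(s["notional"] for s in snapshots)
    seen = {s["region"] for s in snapshots}
    missing = EXPECTED_REGIONS - seen
    return {
        "total_notional": total,
        "regions_present": sorted(seen),
        "regions_missing": sorted(missing),
        "complete": not missing,
    }

result = aggregate_with_coverage(snapshots)
```

With only the APAC feed loaded, `result` reports a total of 165.0 flagged as incomplete, with EMEA and AMER listed as missing -- exactly the signal an analyst needs before trusting an intraday number.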
An Accurate Workflow
Decisions made throughout the day are often based on inaccurate data. Firms must put in place a process that enables data accuracy throughout the workflow: as data is created, it should be validated and moved on, with any corrections made in near real time.
A real-time data flow minimizes the number of corrections needed over an extended period and produces a more consistent view of data across the front, middle and back office. The middle office must own this process, as it is often the source of record for audit and regulatory reporting.
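The validate-as-created flow described above can be sketched as a simple gate: each record is checked as it arrives, clean records continue downstream, and exceptions are routed back for near-real-time correction. The validation rules and field names here are illustrative assumptions, not any particular vendor's checks:

```python
def validate_trade(trade):
    """Return a list of data-quality issues for one trade record.
    An empty list means the record passes these basic checks.
    The rules below are illustrative examples only."""
    issues = []
    if trade.get("quantity", 0) <= 0:
        issues.append("non-positive quantity")
    if not trade.get("counterparty"):
        issues.append("missing counterparty")
    if trade.get("price") is None:
        issues.append("missing price")
    return issues

def process(trades):
    """Split the incoming flow: clean records move downstream,
    exceptions are held for correction instead of polluting analytics."""
    clean, exceptions = [], []
    for trade in trades:
        issues = validate_trade(trade)
        record = {**trade, "issues": issues}
        (exceptions if issues else clean).append(record)
    return clean, exceptions

trades = [
    {"quantity": 100, "counterparty": "BankA", "price": 99.5},
    {"quantity": 0, "counterparty": "", "price": None},
]
clean, exceptions = process(trades)
```

The point of the gate is timing: catching the second record the moment it is created is a near-real-time fix, whereas discovering it at end-of-day reconciliation means every intraday decision in between was based on bad data.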
The middle office is uniquely positioned within the firm to provide an environment of record that oversees data quality, metadata and standard analytics across businesses and geographies. The net result would be a simplified data processing environment with a clear, defined view of data -- giving the business much stronger analytical capabilities, faster time to market, and an improved, consistent understanding of the firm's position from the desk level to the C level.
Firms now have an opportunity to step back and reconsider how important data management is, not only to support business agility but also to ensure the industry never again ends up in an economic situation like that of 2008-2009. Firms must use accurate data to stay protected.
Stuart Grant has more than 10 years' experience in the market data industry, working in product management and business development to deliver market data and data feed solutions to buy-side and sell-side organizations in support of business-critical investment processes, including quantitative, risk and performance analytics. Having recently joined Sybase from Thomson Reuters, Grant focuses on business development within financial services, with an emphasis on enabling firms to create a holistic data management platform.