Data Management:

Credit Crisis Makes Data Management New Priority, Baring Risk-Related Issues Swept Under the Carpet

Originally published March 16, 2009

NEW YORK — With the stronger focus on risk management, the underlying management of data needed to accurately assemble risk views is starting to transform, say data management systems providers and financial industry observers.

The need for greater transparency on derivatives and complex instruments that proved risky in the economic downturn is placing greater emphasis on securities pricing data, industry participants and providers say. Operationally, firms and providers are moving away from the data warehouse method of handling data and are instead rolling up information and operational functions into single systems that handle more functions and more types of data.

The size and complexity of many major financial institutions adds to the challenge when those firms are using systems that have been in place a long time. “The process of moving data from a central warehouse down to an application, and transforming it to fit legacy environments, is hard,” says Michael Atkin, Managing Director of the Enterprise Data Management (EDM) Council, a nonprofit trade association focusing on managing and leveraging enterprise data.

“Ultimately, our customers want to know where their exposure is,” says Richard Enfield, Business Owner, Asset Control Plus, at Asset Control Systems Inc. “In the credit crisis, as the dynamics of institutions change, the people who are the ultimate obligors for the instruments are changing. You might have exposure to companies from a variety of places. You may have direct exposure to them because you own an instrument that they issued.”

The complexity of determining exposures led Asset Control to identify where various “pockets” of information are stored, explains Enfield. “It’s amazing how many different departments have pieces of data, yet holistically, nobody has it all,” he says. “The data issue that has arisen out of the credit crisis is very large and people have to look at it in a whole new way. Things they do operationally can impact the value of their investments. Tiny pieces of data can have massive implications on valuation and exposure in an organization. Being able to get that information to the right place at the right time is critical. You have to establish the links so you know when two entities that you’re dealing with in two different places are really the same. That may sound easy, like both of them would have the same name, but to a computer, one might be ‘00629’ and the other is ‘B9427.’ It doesn’t sound hard, but pulling all that together is very difficult.”
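
To make the identifier problem Enfield describes concrete, here is a minimal sketch of cross-referencing two local identifiers to one canonical entity; all identifiers, names and amounts are hypothetical:

```python
# Minimal sketch of cross-referencing entity identifiers across systems.
# All identifiers, entity names and amounts are hypothetical.
from collections import defaultdict

# Two internal systems key the same legal entity differently.
trading_positions = {"00629": 5_000_000}     # exposure keyed by a trading-system ID
settlement_positions = {"B9427": 2_000_000}  # same obligor under a settlement-system ID

# A cross-reference table maps every local identifier to one canonical entity.
xref = {"00629": "ENTITY_ACME_HOLDINGS", "B9427": "ENTITY_ACME_HOLDINGS"}

# Roll exposure up to the canonical entity across both systems.
total_exposure = defaultdict(float)
for local_id, amount in list(trading_positions.items()) + list(settlement_positions.items()):
    total_exposure[xref[local_id]] += amount

print(dict(total_exposure))  # {'ENTITY_ACME_HOLDINGS': 7000000.0}
```

The hard part in practice, as Enfield notes, is not the aggregation but building and maintaining the cross-reference table itself.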

The challenge of managing risk in derivatives requires the capability to drill down into these products, according to Gert Raeves, Senior Vice President, Strategic Business Development, at GoldenSource Corporation. “It’s not just about managing a static product lifecycle, but a lot of the counterparty risk associated with being in that contract lifecycle,” he says. “They need to drill down to the counterparty view and also to a position management view. Because there are such long, open cycles in position management, there are obvious mark-to-market and portfolio valuation requirements down the line. That can only be done by linking product definitions and counterparty risk categories with updated transaction and position data.”
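
A rough sketch of the linkage Raeves describes might look like the following, where product, counterparty and position records are joined into a per-counterparty exposure view; the schema and figures are illustrative assumptions, not GoldenSource’s actual data model:

```python
# Sketch of linking product, counterparty and position records to produce
# a counterparty-level exposure view. Schema and numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class Position:
    product_id: str        # links to the product definition
    counterparty_id: str   # links to the counterparty record
    notional: float
    mark_to_market: float  # refreshed as valuations update

book = [
    Position("IRS_5Y_USD", "BANK_A", 10_000_000, 250_000),
    Position("CDS_REF_XYZ", "BANK_A", 5_000_000, -40_000),
    Position("IRS_5Y_USD", "BANK_B", 8_000_000, 120_000),
]

# Drill down from the book to per-counterparty mark-to-market exposure.
exposure = {}
for p in book:
    exposure[p.counterparty_id] = exposure.get(p.counterparty_id, 0.0) + p.mark_to_market

print(exposure)  # {'BANK_A': 210000.0, 'BANK_B': 120000.0}
```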

The complexity of derivatives adds to the complexity of data, according to Joseph Amarante, Senior Vice President of Professional Services at NorthPoint Solutions, a business IT consultancy that addresses the handling of loans, credit derivatives, swaps and numerous other types of assets. “The issue is not the applications themselves, but the data,” he says. “For CDS [credit default swaps] valuation, most firms use Bloomberg CDS capability, which has its own limitations. A firm that has 1,000 or 2,000 CDS positions isn’t going to type in each one. That becomes unmanageable. For analytics and integration, if someone wants to enter a price and get a spread, or enter a spread and get the price, how will it affect cash flow? There aren’t tools out there to do that.”
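
Amarante’s enter-a-spread-get-a-price example can be illustrated with a back-of-the-envelope conversion. The sketch below assumes a flat hazard rate and uses the standard “credit triangle” approximation; it is an illustration only, not the Bloomberg capability he describes:

```python
# Back-of-the-envelope CDS spread-to-upfront-price conversion using the
# flat-hazard-rate "credit triangle" approximation. Illustration only;
# not the Bloomberg or ISDA standard model.
import math

def upfront_from_spread(par_spread, coupon, maturity, recovery=0.40, rate=0.03):
    """Approximate upfront payment per unit notional for a CDS paying a
    fixed coupon when the market quotes a given par spread."""
    hazard = par_spread / (1.0 - recovery)  # credit triangle: spread = (1 - R) * hazard
    # Risky annuity: discounted, survival-weighted value of premiums paid to maturity.
    risky_duration = (1.0 - math.exp(-(rate + hazard) * maturity)) / (rate + hazard)
    return (par_spread - coupon) * risky_duration

# A 5-year CDS with a 100bp running coupon quoted at a 250bp par spread:
print(upfront_from_spread(0.025, 0.01, 5.0))  # ~0.063, i.e. ~6.3% of notional upfront
```

Going the other way, from an observed price back to a spread, means inverting the same relationship numerically, which is exactly the kind of tooling Amarante says is missing.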

Existing portfolio management products are mostly designed for exchange-traded asset classes and, as a result, not only support this data poorly but cannot store or calculate it, according to Amarante. “The data structures aren’t robust enough, the interfaces aren’t robust enough, the user interfaces aren’t robust enough and the subject matter expertise isn’t robust enough,” he says. “So there’s a triple witching in what the vendor products can provide.”

Data has to be broken down into smaller pieces, according to Kevin Goldstein, Senior Vice President of Business Development at NorthPoint Solutions. “It’s a combination of breaking down data, aggregating it and then rolling it up,” he says. “It goes both ways. If it goes down to the security level, you may bring it up to a pool level if you have a pool of mortgage [securities] or a deal that represents many different assets. You’re breaking it down and rolling it up.”
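
In code, the breaking down and rolling up Goldstein describes reduces to grouping security-level records under a pool-level key; a minimal sketch, with hypothetical field names and values:

```python
# Sketch of "breaking down and rolling up": security-level records
# aggregated to the pool (deal) level. Fields and values are hypothetical.
from collections import defaultdict

securities = [
    {"pool": "MBS_POOL_A", "security": "LOAN_001", "market_value": 1_200_000},
    {"pool": "MBS_POOL_A", "security": "LOAN_002", "market_value": 800_000},
    {"pool": "MBS_POOL_B", "security": "LOAN_003", "market_value": 2_500_000},
]

# Roll the security-level detail up to one value per pool.
pool_values = defaultdict(float)
for s in securities:
    pool_values[s["pool"]] += s["market_value"]

print(dict(pool_values))  # {'MBS_POOL_A': 2000000.0, 'MBS_POOL_B': 2500000.0}
```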

Another factor in data, particularly pricing data, is that valuation formulas tend to change or struggle in times of great economic stress, according to Enfield of Asset Control. “Ultimately the value of something is what someone is willing to pay you for it,” he says. “In the cases of a lot of instruments, the value is not there. The question is determining what the right price is. That’s a huge struggle now because the models people used are breaking down.”

Pricing, particularly for derivatives and complex securities, requires complex analytics, observes Barry Thompson, Chief Technology Officer of Tervela, a messaging systems provider. “The pricing is done in very large grid or computational lattices that take a very long time to run,” he says. “Macro-level quant data elements go into all the analytics used to generate price. Huge analytic engines are run overnight but the underlying assumptions on macroeconomics are a quarter or a month old. There’s no real-time pricing capability.”
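
As a toy illustration of the lattice-based pricing Thompson refers to, the sketch below prices a European call on a Cox-Ross-Rubinstein binomial tree; production lattices are vastly larger and embed the macro-level inputs he mentions, and all parameters here are hypothetical:

```python
# Toy Cox-Ross-Rubinstein binomial lattice for a European call.
# Illustrative only; real pricing grids are far larger.
import math

def crr_call(spot, strike, rate, vol, expiry, steps=500):
    dt = expiry / steps
    u = math.exp(vol * math.sqrt(dt))        # up-move factor
    d = 1.0 / u                              # down-move factor
    p = (math.exp(rate * dt) - d) / (u - d)  # risk-neutral up probability
    disc = math.exp(-rate * dt)
    # Terminal payoffs at each lattice node...
    values = [max(spot * u**j * d**(steps - j) - strike, 0.0) for j in range(steps + 1)]
    # ...rolled back through the tree one step at a time.
    for _ in range(steps):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

print(crr_call(100, 100, 0.03, 0.25, 1.0))  # ~11.3, close to the Black-Scholes value
```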

Some are looking to pricing transparency as a solution, with JPMorgan Chase recently open-sourcing its CDS pricing model, notes Enfield. “There are two pieces to transparency — one is consistency, knowing every day that you’re doing the same thing,” he says. “The other is looking through corporate structures to know what, ultimately, the exposure is to.”

Firms are also struggling to move data on complex securities into real time, according to Brian O’Keefe, Director of Product Management at Panopticon Software, a data visualization software provider. “People built reports with very static, somewhat old data; it ends up in a business intelligence cube, and only a couple of people in your organization can write the queries to that cube,” he says. “Once those cubes are built, the reporting on them is very slow and static. People really want to ask questions about the data. They don’t want to have canned questions presented to them every day or at the end of the day or whenever they get these reports. They want data on the fly.”

The acceptable time horizon for data may be decreasing, although it may not reach real time, according to Leonid Frants, President and founder of OneMarket Data LLC, a provider of enterprise tick data solutions. “When doing risk management, you might not care about seeing your portfolio on every tick, any time any price changes somewhere,” he says. “One second may not matter, but they need to view [data] at a much finer granularity. Moving from daily frequency to any kind of intra-day frequency requires completely different data management. Immediately, the standard systems you use for your data simply don’t work. People want to see more detail of what’s happening.”
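
A minimal sketch of the granularity shift Frants describes, using pandas and synthetic ticks; the library choice, fields and frequencies are illustrative assumptions, not OneMarket Data’s system:

```python
# The same price stream viewed at daily versus one-second granularity.
# Synthetic data; library choice and parameters are illustrative.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
ticks = pd.DataFrame(
    {"price": 100 + rng.standard_normal(10_000).cumsum() * 0.01},
    index=pd.date_range("2009-03-16 09:30", periods=10_000, freq="50ms"),
)

daily = ticks["price"].resample("1D").last()       # collapses to one end-of-day value here
one_second = ticks["price"].resample("1s").ohlc()  # intraday one-second OHLC bars

print(daily)
print(one_second.head())
```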

Data management is moving away from the data warehouse method, according to Gavin Little-Gill, Executive Vice President, Front Office Solutions, at Linedata Services, a financial IT solutions provider. “There was a view that all the systems were linked together by the data warehouse,” he says. “Now people are looking for a source system whose data is much more accurate and feeds through to peripheral applications. They ask where the data comes into the application and how to get it right the first time. They demand that we integrate with all the data providers seamlessly for everything from pricing information to security master information.”

The move away from the data warehouse model has raised other questions, observes Ben Keeler, Director at consultancy Citisoft. “Can [investment managers] rely on an outsourcing firm’s data warehouse?” he says. “Who’s responsible for cleansing and interacting with the data from a compliance perspective, ensuring its accuracy and completeness?” Economic woes are affecting firms’ ability to take on large enterprise-level data management projects, however, according to Keeler. “Putting a band-aid on existing systems or doing more tactical work as opposed to a larger warehouse is definitely preferable to many,” he says. “In some cases, our clients are moving ahead with a warehouse, but they’re doing it in a fairly segmented manner.” A centralized data warehouse is no longer absolutely necessary for client reporting, adds Keeler.


© 2005-2009 Investment Media Inc.