
Risk Data Management Lessons from Bank of America?

"Financial Services is a data industry. We are not manufacturing something and sending it off the assembly line, and while currencies are physical, most of it is book entry. The firm is data” says Chief Data Officer, Peter Serenita of HSBC. 

So, as data practitioners, we wanted to find out what lessons can be learned from the recent $4 billion accounting error at Bank of America.

Let me start with a couple of confessions. First, I am neither an accounting nor a treasury professional, nor am I trained in instruments like “structured notes”. However, an operational risk incident of this magnitude in a data industry, one that led to real setbacks (suspension of dividends and buybacks, refiling CCAR, etc.), sparked our curiosity: could data management have played a better role in preventing this risk?

Second, our focus is mainly on what lessons we can learn, not on who did it or where it happened. After all, risk management is about learning from past mistakes and not repeating them, isn’t it? So here are our musings, which we thought were worth sharing.

A simplified view of the problem: the value of certain instruments goes up or down based on changes in external factors such as external credit ratings. So when those external factors change, the instruments' values have to be updated/adjusted accordingly. In this case, the value of the instruments in question was inversely proportional to the credit rating, and it appears Bank of America didn't adjust its capital downward after the credit rating was raised following the acquisition of Merrill Lynch five years ago.

Here is an example (perhaps a crude one) that may help in understanding the issue. Let’s say your tax accountant is unaware of your new, higher mortgage interest rate and payments. Numbers based on the old rate and payments will be off, and the accountant's estimate of your available balance will be much higher than what you truly have. Right? Even so, your accountant can rightfully claim that the calculations were accurate based on what they knew.
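
For readers who like to see the arithmetic, here is a minimal sketch in Python of the same failure mode. The instrument, the rating scale, and all the numbers are made up for illustration; the point is only that a valuation driven by an external factor is wrong the moment that factor changes and the value is not refreshed.

    # Toy model only: the value of a hypothetical structured note moves
    # inversely with the issuer's credit rating, so every rating change
    # must trigger a revaluation. Scale and numbers are made up.
    RATING_SCORE = {"BBB": 3, "A": 4, "AA": 5}   # higher score = better rating

    def note_value(face_value, rating):
        # Inverse relation: as the rating improves, the value goes down.
        return face_value / RATING_SCORE[rating]

    booked  = note_value(1_000_000, "BBB")   # valued at acquisition: ~333,333
    correct = note_value(1_000_000, "A")     # after the upgrade:      250,000

    # If the stale rating is never refreshed, capital stays overstated.
    print(f"overstatement per note: {booked - correct:,.0f}")   # ~83,333

The calculation is internally consistent, just like the accountant's; the error lives entirely in the stale external input.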

If you are interested in a deeper analysis of the capital calculations, we found these articles interesting and informative.

  1. Bank of America, nothing complex, it’s noblesse oblige!
  2. Bank of America Finds a Mistake: $4 Billion Less Capital


Thoughts and findings:

  1. Fixing the quantitative part might be easier than meeting the qualitative requirements of CCAR within one month. Establishing an effective risk management framework and processes to address operational, credit, market, and legal risks might be harder, given that the issue went undetected for five years. In a previous edition, we discussed the qualitative aspects of CCAR.
     
  2. There were quite a few bank M&As in the last five years, so this is a very good reason for other banks to verify their own calculations and the effectiveness of their risk management processes.
     
  3. Are we missing an important aspect of data quality? We think so: a “Correlation” dimension to identify and tie internal risk data elements to the external factors whose values they are inter-related with.

     This dimension applies only to the special set of attributes that need to be “recalculated/refreshed” every time the external factors they depend on change. Without accurate correlation, subsequent derivations or data quality checks might not be effective. After all, data quality checks must have been in place at Bank of America on the elements used for the capital adequacy calculation, as that is a regulatory requirement, yet they could not detect the issue. Right?

     Most data quality checks are on data elements internal to an organization, so if and when those elements depend on external factors, they need a separate set of controls and checks (see the sketch after this list).
     
  4. Establish a hierarchy/sequence for data quality checks by importance, perhaps in this order: a. correlation checks, b. business data quality rules, c. accuracy, d. integrity, and everything else.
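
To make points 3 and 4 concrete, here is a minimal sketch, in Python, of what a correlation-aware check pipeline could look like. Every name, field, and date below is hypothetical, and a real implementation would live inside a data quality tool rather than a script.

    from dataclasses import dataclass
    from datetime import datetime

    # Hypothetical registry entry tying an internal attribute to the external
    # factors its value is derived from -- the proposed "correlation" dimension.
    @dataclass
    class CorrelatedAttribute:
        name: str
        depends_on: list[str]        # external factors, e.g. an issuer rating
        last_recalculated: datetime

    def correlation_check(attr, factor_updates):
        """Pass only if no external factor changed after the attribute was
        last recalculated; otherwise the internal value is stale."""
        return all(factor_updates[f] <= attr.last_recalculated
                   for f in attr.depends_on)

    def run_quality_checks(attr, factor_updates, business_rules, record):
        # a. correlation first: the later checks can all "pass" on a value
        #    that is precisely, consistently stale.
        if not correlation_check(attr, factor_updates):
            return f"{attr.name}: STALE - recalculate before further checks"
        # b. business data quality rules
        for rule in business_rules:
            if not rule(record):
                return f"{attr.name}: business rule failed"
        # c. accuracy and d. integrity checks would follow here.
        return f"{attr.name}: passed"

    # Hypothetical example: the rating moved after the last recalculation.
    capital_adj = CorrelatedAttribute(
        name="capital_adjustment",
        depends_on=["issuer_credit_rating"],
        last_recalculated=datetime(2009, 1, 1))
    factor_updates = {"issuer_credit_rating": datetime(2009, 6, 1)}
    print(run_quality_checks(capital_adj, factor_updates, [], {}))
    # -> capital_adjustment: STALE - recalculate before further checks

The design point is simply that ordering matters: a stale value can sail through business rules, accuracy, and integrity checks, so the correlation check has to run first.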


Principles of Effective Risk Data Management

Speaking of regulation and data quality, the 14 principles of effective risk data aggregation and reporting were published by the BCBS in January '13. While they are intended for the financial industry, every business can benefit from implementing them. In January '14, the Senior Supervisors Group published a status report on the state of data quality; here are some excerpts from it.

  1. Five years after the financial crisis, progress on timely and accurate counterparty risk measures has been largely unsatisfactory.
     
  2. While firms have made progress in certain key areas of counterparty risk management, on the whole current practices fail to meet supervisory expectations or industry self-identified best practices for timely and accurate reporting of top counterparty exposures.
     
  3. Recurring data errors indicate that many firms are below SSG benchmark standards for data quality and cannot measure and monitor the accuracy of the data they submit or rectify quality issues in a timely manner.
     
  4. Firms’ difficulties producing the Top 20 Counterparty report accurately and on time may reflect their inability to aggregate exposure to all counterparties.
     
  5. While some firms involved in the project met all supervisory expectations for timeliness and frequency, data aggregation capabilities, and data quality, others failed to make as much progress as expected. Going forward, supervisors will expect firms to continue to devote time and attention to the infrastructure necessary to aggregate and update exposures accurately and in a timely manner. This includes the ability to thoroughly review data quality and trend analysis to identify data anomalies.


Click here for the complete report.

In light of the above findings, banks and financial institutions will need to implement master data management (MDM) solutions to have a single, trusted version of their counterparty data.

Master data is data that provides context to transactions. For example, when you buy a product at a store, Customer, Product, Store, Geography, and Time are a few examples of master data that provide context to that transaction. An MDM solution includes data governance processes, data quality, and tools for matching, merging, de-duping, enriching, and publishing accurate versions of counterparty data to consuming systems. The same discipline also goes by Party Master, Customer Master, or Customer Data Integration (CDI) in other industries.
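
As a toy illustration of the matching-and-merging step, here is a minimal sketch in Python. Real MDM platforms use probabilistic matching, survivorship rules, and stewardship workflows; the records, fields, and normalization below are purely hypothetical.

    # Toy sketch of MDM-style matching and merging for counterparty records.
    def normalize(name):
        # Crude match key: uppercase and strip everything but letters/digits,
        # so "Merrill Lynch & Co." and "MERRILL LYNCH CO" collide.
        return "".join(ch for ch in name.upper() if ch.isalnum())

    def merge_counterparties(records):
        golden = {}
        for rec in records:
            # Prefer a real identifier (e.g. an LEI) when present.
            key = rec.get("lei") or normalize(rec["name"])
            master = golden.setdefault(key, {})
            # Simplest survivorship rule: first non-missing value wins.
            for field, value in rec.items():
                if value is not None:
                    master.setdefault(field, value)
        return list(golden.values())

    duplicates = [
        {"name": "Merrill Lynch & Co.", "country": "US"},
        {"name": "MERRILL LYNCH CO", "lei": None, "exposure_usd": 1_200_000},
    ]
    print(merge_counterparties(duplicates))
    # -> a single golden record combining both sources

Without this consolidation step, the same counterparty can appear several times under slightly different names, which is exactly what makes a Top 20 Counterparty report hard to aggregate accurately.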

Comments or questions? Send email to aikya at aikya-inc.com.