
Special Edition | Enterprise Data Management at the Federal Reserve Board

Welcome to the special edition of our Data Management newsletter. 

In this edition we have a very special guest who shares the data management initiatives underway at the Federal Reserve Board. It is my pleasure and honor to speak with Sandra Cannon, Deputy Chief Data Officer at the Federal Reserve Board, and learn about the Board's enterprise data management initiatives since the financial crisis of 2007-2008.

While banks and financial institutions have been implementing risk management frameworks including Basel II/III, LCR, and NSFR, and complying with DFAST, we were very interested to learn about the data management initiatives underway at the Federal Reserve Board to meet the increasing and dynamic data needs in support of the Board's mission. First, a little background on the Fed.

Background: The Federal Reserve is the central bank of the United States, and its primary goals include:

  1. Conducting the nation’s monetary policy by influencing the monetary and credit conditions in the economy in pursuit of maximum employment, stable prices, and moderate long-term interest rates 
  2. Supervising and regulating banking institutions to ensure the safety and soundness of the nation’s banking and financial system and to protect the credit rights of consumers
  3. Maintaining the stability of the financial system and containing systemic risk that may arise in financial markets
  4. Providing financial services to depository institutions, the U.S. government, and foreign official institutions, including playing a major role in operating the nation’s payments system 

An overview of the Federal Reserve, including its purpose and functions, is available on the Board's website.

What was the data management environment at the Federal Reserve prior to the financial crisis of 2007/2008? 

San: Prior to 2007/08, data management within the Board was essentially a decentralized function. Data resided within a unit or division, and there was minimal cross-division data sharing. We had two professional data management groups supporting and providing micro and macro data for the research needs of the Board. During the financial crisis, we quickly realized that more people needed access to more data, and quickly. The lack of formal procedures and guidelines across the enterprise meant that well-intentioned activities might not necessarily be the right thing to do. A siloed approach to data management meant inconsistency, incomparability, and frustration for users. The effects of this approach were increasingly felt between August 2007 and June 2009.

What were some of the steps that the Board undertook to address the issues at strategic, tactical, and operational levels? 

In May 2010, we created the Board Data Council (BDC) to share information on data issues and challenges. The data management groups in the Research division were restructured, primarily into an applications group and a data group.

In April 2011, a Chief Operating Officer was hired, who heard echoes of dissatisfaction with the data landscape. In September 2011, our strategic planning process began; instituting enterprise data governance was one of the six strategic initiatives recommended. A discovery process was initiated to understand the current state of data and architecture, pain points, process issues, and gaps. The findings included:

 

  1. Few consistent policies
  2. Limited formal coordination between divisions on data
  3. Highly variable data management
  4. A constrained data environment
  5. Long-standing silos, with subject matter experts maintaining "their" data, that will not scale to handle the new data landscape

The recommendations included the creation of a Board-wide data governance and management structure that supports the growing quantity of data and the need to share it, while ensuring the flexibility and nimbleness required and expected by key stakeholders in the data landscape. For example, the research community felt it was important to maintain each division's ability to acquire data and manage its own data budget, and the supervisory community felt strongly about maintaining intellectual ownership over the content and timing of data collections.

What was the outcome of the discovery process? 

We have taken several key steps along the way; some of them are listed below.

  1. We hired a Chief Data Officer to craft guidelines and policies for good data hygiene practices across the Board. The position reports to the COO, works with and chairs the BDC, and coordinates with any System or other data groups. Micheline Casey became the Board's first Chief Data Officer in May 2013.
  2. We reconstituted the BDC with senior-level membership and responsibility to approve those policies, represent their divisions' concerns to the CDO, and relay policy decisions to their data professionals.
  3. Together, they set the scope and standards for data governance.
  4. Groups from the research area that had been providing services across the Board were moved into the Office of the Chief Data Officer to formally become enterprise service providers.
  5. The OCDO's scope includes operating authority for data governance and data management within the Board and across Board-delegated functions.
  6. The OCDO focuses on enterprise data supporting the primary missions of the Board, including monetary policy, financial stability, supervision, consumer protection, and economic research.

We have embarked on a Board-wide data management journey to ensure we have the right processes and controls in place to support the complete lifecycle of data. The Office of the Chief Data Officer (OCDO) is working to put basic data hygiene practices in place, create consistent data standards and procedures, and define roles, responsibilities, and accountabilities as part of a comprehensive data governance program.  

We are creating an information architecture practice to provide a conceptual and logical framework that allows the scalability to meet current and future needs. As an enterprise service provider, the OCDO will need to have the right SLAs in place to meet or exceed stakeholder expectations, and a framework to measure and monitor data management goals on a consistent basis.

Who does the CDO report to? 

The Chief Data Officer reports to the COO, as does the CIO. The CDO and the CIO are peers, working collaboratively. 

What are the main functions, services, and goals of the OCDO? 

The service portfolio of the OCDO can be broadly categorized into the following five areas:

  1. Governance
  2. Information Architecture
  3. Data Management
  4. Program Management
  5. Advisory Services 

 

  1. Data Governance services include setting data standards and policies for the entire lifecycle of data, establishing and maintaining high levels of data quality, supporting the Board's risk management initiatives, and managing the clearance and intake process.
     
  2. Information Architecture services include the creation of the Board's strategic and conceptual information architectures, performing business analysis and data flow modeling in support of data governance and data management activities, and enterprise metadata and taxonomy development and management.
     
  3. Data Management includes the acquisition, integration, and distribution of data from internal and external sources. Overall, data management addresses the complete lifecycle of data activities, from acquisition and integration through processing and reporting to archiving.
     
  4. Program Management includes project management, change management, communications, training and education in support of OCDO activities and data management efforts. 
     
  5. Advisory Services include providing expertise around content, help desk support, and industry best practices.

The OCDO is still in its first year and is taking rapid strides in creating a sustainable office, modernizing core foundational activities, meeting our customer service goals, and fulfilling our stakeholder obligations.

This includes communicating regularly and frequently across the Board and the Federal Reserve System, engaging in planned and ongoing outreach initiatives, revisiting and re-engineering our current business processes to improve our ability to provide service across the enterprise, and hiring information architects, business analysts, and data governance specialists.

When banks and financial institutions submit their daily/weekly/monthly data reports including LCR and other Basel reports how does the Board integrate them and ensure data quality? 

The Federal Reserve System comprises 12 Federal Reserve Districts across the country that support the function of acquiring, integrating, and distributing data from different banks and financial institutions. Data quality rules and validations are performed at the Reserve Banks, which may do the initial collection, to ensure the data meet consistency, completeness, and accuracy requirements. Data quality checks are also performed at the Federal Reserve Board level once the data are transmitted from the District Banks.
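For readers curious what such validations can look like in practice, below is a minimal, purely illustrative Python sketch of completeness, consistency, and plausibility checks on a single hypothetical report submission. The field names, the balance-sheet rule, and the tolerance are assumptions made for this example only and do not represent the Federal Reserve's actual validation rules or systems.

# Illustrative sketch only: hypothetical completeness, consistency, and
# plausibility checks of the general kind described above. Field names,
# rules, and tolerances are invented for this example and do not reflect
# the Federal Reserve's actual validations.
from dataclasses import dataclass, field


@dataclass
class ValidationResult:
    errors: list = field(default_factory=list)

    def add(self, message: str) -> None:
        self.errors.append(message)

    @property
    def passed(self) -> bool:
        return not self.errors


REQUIRED_FIELDS = ["institution_id", "report_date", "total_assets",
                   "total_liabilities", "total_equity"]


def validate_report(report: dict) -> ValidationResult:
    result = ValidationResult()

    # Completeness: every required field must be present and non-empty.
    for name in REQUIRED_FIELDS:
        if report.get(name) in (None, ""):
            result.add(f"missing required field: {name}")

    # Consistency: the balance-sheet identity should hold within a small tolerance.
    try:
        assets = float(report["total_assets"])
        liabilities = float(report["total_liabilities"])
        equity = float(report["total_equity"])
        if abs(assets - (liabilities + equity)) > 0.01:
            result.add("total_assets != total_liabilities + total_equity")
        # Plausibility: reported values should fall in a sensible range.
        if assets < 0:
            result.add("total_assets is negative")
    except (KeyError, TypeError, ValueError):
        result.add("balance-sheet fields missing or non-numeric")

    return result


if __name__ == "__main__":
    sample = {"institution_id": "HYPOTHETICAL-001", "report_date": "2014-06-30",
              "total_assets": 1000.0, "total_liabilities": 900.0, "total_equity": 100.0}
    print(validate_report(sample).passed)  # True: sample is complete and consistent

Checks of this kind can be re-run at a second stage, much as the Board applies its own quality checks after the Reserve Banks transmit the data, since validating again after transmission guards against errors introduced in handoff.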

Does Data Governance at the Board extend to the 12 Federal Reserve Districts as well? 

The data governance program being developed by the OCDO covers Board data assets, that is, data held at the Board as well as data that may be managed or housed at the Reserve Banks under delegated authority from the Board. We would like to be a resource to those across the System interested in data governance and data management best practices.

Thank you very much, San, for your time and input. We believe data management practitioners across industry lines can benefit from your experience and insights. We really appreciate it!

San oversees the groups responsible for the reports clearance process and the maintenance of data collected from institutions and procured from private and government data providers. She is active in the international data community, especially the academic and central banking spheres, and works closely with U.S. statistical and regulatory agencies on data and metadata issues. San is currently working to help stand up an enterprise data service and to help create and implement an enterprise data governance program through the Office of the Chief Data Officer. San has a Ph.D. in Economics from the University of Wisconsin-Madison and an M.Sc. in Economics from the London School of Economics and Political Science.

Please note the views expressed here are not representative of the Board of Governors. This article includes select excerpts from a DAMA conference presentation.