Saturday, August 23, 2008

Data Quality for Enterprise Risk Management


Ramchandra, Vikram, and Srikant, Sreedhar


Abstract


Enterprise risk management (ERM) is gaining renewed prominence with the advent of regulatory initiatives such as the Sarbanes-Oxley Act (SOX) and Basel II. ERM requires high-quality, integrated data for making risk-based decisions, and a data management system is the foundation for providing such data. Data quality is a key component of data management, ensuring that decisions are based on fit-for-purpose data. By developing a data-quality program focused on business initiatives and treating data as a true enterprise asset, an organization can assure high-quality data for initiatives such as ERM, ensuring compliance and propelling the enterprise toward competitive differentiation.


Introduction


Enterprises are affected by low-quality data, but unless data-quality issues are tied to specific business impacts, it is common for organizations to live with them. Consider the business impacts caused by low-quality data in these real-world examples:


* Inaccurate SIC data led to the misclassification of customers at a commercial bank, resulting in faulty exposure estimates. The inaccuracy hid the homogeneity of the bank's customer base even though the bank believed its customers were well diversified. When the natural gas market suffered, the bank incurred major losses because it had a large number of accounts in the Southwest.


* At another bank, a single customer had ten different identifiers in the loan origination system. Because there was no common customer-identifying information, the defaults in one account could not be related to the remaining accounts. The result: a risk that could have been identified and controlled earlier was left to spread and affect all the accounts.


* A financial institution had 35 different account status codes, some of which did not actually provide status information. Only a few bank associates knew the actual meaning of these status codes, resulting in incorrect credit risk assessments and grade assignments for these accounts.


Several interrelated reasons caused these problems. The complexity of today's business and information technology (IT) environments is growing. IT now combines legacy systems, enterprise applications (ERP, CRM, and others), homegrown applications, and a plethora of data repositories. Businesses base critical decisions on numerous internal data sources across the enterprise as well as on a large amount of relevant external data.


Along with the data-quality issues, the failures cited highlight another systemic issue: the absence of enterprisewide risk initiatives. Had these organizations viewed risk with an enterprise focus, the processes that generated the poor-quality data might have been identified and corrected at an earlier stage. As will become clear from the classification of risks in the following section, poor data quality is essentially an operational risk for the enterprise.


Enterprise Risk Management: Key Facets


In simple terms, ERM is a set of processes that identify and control risks arising from the enterprise's business activities. Surrounding the identification and control is a set of metrics that can be used to measure the efficacy of these processes. There are many similarities between ERM and total quality management (TQM), the manufacturing-led quality movement of the 1980s and 1990s that focused on enterprisewide activities and processes to improve the overall quality of products and processes.


TQM moved responsibility for quality away from a "siloed" quality assurance department and made quality the focus across the enterprise. Like TQM, ERM can succeed only if it starts with business backing and a business vision. ERM is an enterprisewide activity, not the responsibility of a solitary risk department. ERM cannot be an IT-led exercise, even though IT is a key ally and stakeholder in implementing and maintaining it.


What are the different types of risks faced by an organization? Table 1 shows a classification of risks along with a brief definition. Many of these risks are interrelated; for example, large negative impacts from the top eight always create a risk to an organization's reputation.


Since ERM requires an enterprisewide view of information, the activities of gathering, harmonizing, analyzing, and presenting risk information gain the highest priority in this initiative. Data management accomplishes these tasks and is therefore a key facet of an ERM solution. The ERM components cover a variety of risk processes, broadly encompassing the definition, identification, assessment, and mitigation or management of risk, and the continuous monitoring and communication of risk metrics.


While data management provides the content for risk management and the risk methodologies provide the structure for managing that content, corporate governance provides a set of guiding principles that reflect the enterprise's risk-management philosophy, operating culture, and core values. Governance is an often-overlooked component that has a significant impact on the functioning of the risk methodologies.


The following sections will provide additional details about ERM components and data management.


ERM Components


We will use the COSO (Committee of Sponsoring Organizations of the Treadway Commission) framework to briefly review the key components of ERM.


1. Internal environment is the organization's philosophy for managing risk (risk appetite and tolerance, values, etc.)


2. Objective setting identifies specific goals that may be influenced by risk events


3. Event identification recognizes internal or external events that affect the goals


4. Risk assessment considers the probability of an event and its impact on organizational goals


5. Risk response determines the organization's responses to risk events such as avoiding, accepting, reducing, or sharing


6. Control activities focus on operational aspects to ensure effective execution of the risk response


7. Information and communication informs stakeholders of relevant information


8. Monitoring continuously evaluates the risk management processes


Data Management for ERM


For compliance-driven risk programs such as Basel II or SOX, data requirements play a central role in dictating the risk architecture. Consider Basel II specifically: it provides a set of guidelines for financial institutions to perform risk-based capital calculations. To comply with these guidelines, banks must show they have the data (including up to seven years of history) required to calculate risk metrics such as probability of default and loss given default.
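To make the data dependence concrete, these parameters combine multiplicatively into an expected-loss figure for an exposure. The sketch below illustrates that relationship; the parameter values are purely hypothetical examples, not prescribed figures.

```python
# Minimal illustration of the expected-loss relationship EL = PD x LGD x EAD.
# The values below are hypothetical; they are not Basel II requirements.

def expected_loss(pd: float, lgd: float, ead: float) -> float:
    """Expected loss for a single exposure."""
    return pd * lgd * ead

# A missing or inaccurate value in any one of these attributes distorts the
# result, which is why years of clean, traceable history matter.
print(expected_loss(pd=0.02, lgd=0.45, ead=1_000_000))  # 9000.0
```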


Besides a focus on data, the Basel II guidelines require that an organization follow a robust set of data-management practices to ensure the quality of the underlying data; they also demand techniques to trace data lineage and to perform calculations based on historical data.


Data management is a set of policies and procedures that span the complete life cycle of data, from data generation and its conversion into information through the archival or discarding of data. It consists of two major components: the data content itself and the accompanying infrastructure. For Basel II, for example, content includes Credit Ratings, Obligor, Credit Facilities, Credit Reports, Basel Asset Classes, Loss and Recovery, and Collateral, among other categories. The infrastructure includes all the processes and technology used to collect, manipulate, and store the data, such as data movement, data storage and placement (the data warehouse), metadata, data governance, data quality, data modeling, and data archiving. Data management thus plays a key role in providing the quality, integrated data and information needed for a successful ERM implementation.
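As a rough illustration of the content/infrastructure split, the following sketch catalogs a few Basel II content categories alongside the infrastructure processes that act on them. The source-system names and retention figures are assumptions for illustration only, not a prescribed data model.

```python
# Hypothetical catalog: Basel II content categories and the infrastructure
# processes that touch them. Names and retention periods are illustrative.

basel_content_catalog = {
    "Credit Ratings":    {"source_system": "rating_engine",    "history_years": 7},
    "Obligor":           {"source_system": "customer_master",  "history_years": 7},
    "Credit Facilities": {"source_system": "loan_origination", "history_years": 7},
    "Loss and Recovery": {"source_system": "collections",      "history_years": 7},
    "Collateral":        {"source_system": "collateral_mgmt",  "history_years": 7},
}

infrastructure_processes = [
    "data movement", "data storage/placement", "metadata",
    "data governance", "data quality", "data modeling", "data archiving",
]
```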


Data Quality and Risk Management


The methodologies and processes highlighted in the ERM components are only as good as the data they rely on. According to one estimate, poor data quality costs U.S. businesses more than $600 billion a year (Eckerson, 2002). The business impacts are experienced even more acutely in an enterprisewide initiative such as ERM. It is therefore imperative that any ERM initiative be tied to a data-quality initiative.


We present the nine key steps to successful deployment of a data-quality program for an ERM initiative using Basel II as the backdrop.


Step 1. Identify the data elements necessary to manage credit risk. Identifying all the data elements and sources necessary to calculate enterprise risk is no mean feat. Risk data such as probability of default (PD), loss given default (LGD), and exposure at default (EAD), for example, can each require the identification of several different data attributes. Further complicating this effort is that data elements may be spread across systems, databases, and divisions within the organization. If an organization does not have a mature data-management or data-governance framework, this task becomes complex.
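A simple way to capture the output of this step is an attribute inventory that maps each risk measure to the attributes and source systems that feed it. The sketch below uses hypothetical attribute and system names purely for illustration.

```python
# Hypothetical inventory for Step 1: which attributes feed each risk measure,
# and where they live across the organization.

risk_data_inventory = {
    "probability_of_default": [
        {"attribute": "obligor_id",      "system": "customer_master"},
        {"attribute": "internal_rating", "system": "rating_engine"},
        {"attribute": "days_past_due",   "system": "loan_servicing"},
    ],
    "loss_given_default": [
        {"attribute": "collateral_value", "system": "collateral_mgmt"},
        {"attribute": "seniority_class",  "system": "loan_origination"},
    ],
    "exposure_at_default": [
        {"attribute": "outstanding_balance", "system": "loan_servicing"},
        {"attribute": "undrawn_commitment",  "system": "loan_origination"},
    ],
}

# Flat list of (system, attribute) pairs to scope later profiling work.
elements = sorted({(e["system"], e["attribute"])
                   for attrs in risk_data_inventory.values() for e in attrs})
print(elements)
```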


Step 2. Define a data-quality measurement framework. We know that the integrity of the risk calculation requires data to be of high quality, but how do you define data quality? In our experience, a formalized approach to defining data quality, based on a number of dimensions, can form the basis of a data-quality measurement framework.


The key dimensions that data quality traditionally measures include completeness, conformity, consistency, accuracy, duplication, and integrity. In addition, for risk calculations, dimensions such as continuity, timeliness, redundancy, and uniqueness can be important. The dimensions generally harbor a multitude of sins we most commonly associate with poor-quality data: data-entry errors, misapplied business rules, duplicate records, and missing or incorrect data values.


Each attribute identified in Step 1 must be tied to one or more data-quality dimensions detailed in this list. For example, you may decide it is important to measure the completeness of the "Probability of Default" attribute and the conformity of the "Maturity Date" attribute. This framework, once completed, acts as a common and easily understood language across the organization in both business and IT. It is also possible to "roll up" these data-quality dimensions across attributes to a database, application, system, or enterprise.
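A minimal sketch of such a framework, assuming the two example rules above (completeness of probability of default, conformity of maturity date) and illustrative sample data:

```python
# Sketch of a data-quality measurement framework: attributes are tied to
# dimensions, each dimension backed by a measurement function.
from datetime import datetime

def completeness(values):
    """Share of records where the value is present."""
    return sum(v is not None and v != "" for v in values) / len(values)

def conformity_date(values, fmt="%Y-%m-%d"):
    """Share of values that parse in the expected date format."""
    def ok(v):
        try:
            datetime.strptime(v, fmt)
            return True
        except (TypeError, ValueError):
            return False
    return sum(ok(v) for v in values) / len(values)

# Framework: attribute -> list of (dimension name, measurement function).
framework = {
    "probability_of_default": [("completeness", completeness)],
    "maturity_date":          [("conformity",   conformity_date)],
}

records = {
    "probability_of_default": [0.02, None, 0.10, 0.07],
    "maturity_date":          ["2009-06-30", "30/06/2009", "2010-01-15", None],
}

scores = {attr: {dim: fn(records[attr]) for dim, fn in dims}
          for attr, dims in framework.items()}
# Dimension scores can then be rolled up to database, application, or enterprise level.
print(scores)
```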


Any initiative that spans divisions requires such a common-language framework. Standardizing and defining the data through a governance process may be required to arrive at a common list. Figure 2 shows a sample list of attributes chosen to be measured in a credit-risk-management environment.


Step 3. Institute an audit to measure the current quality of data. Perform a data-quality audit to identify, categorize, and quantify the quality of data based upon the decisions made in the previous step. Depending on the data size, a sample or the complete data can be used to perform a data-quality audit. It can be highly useful to produce a data-quality scorecard at this stage. The scorecard provides a metric with which to set data-quality targets. A sample scorecard is shown in Figure 3.
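A scorecard can be as simple as per-attribute dimension scores rolled up into averages at the attribute and enterprise levels. The sketch below uses illustrative attribute names and scores.

```python
# Hypothetical scorecard for Step 3: dimension scores (0-100) per attribute,
# rolled up into a simple average. All numbers are illustrative.

audit_results = {
    "probability_of_default": {"completeness": 92.0, "accuracy": 88.5},
    "maturity_date":          {"conformity": 75.0, "completeness": 98.0},
    "obligor_id":             {"uniqueness": 81.0, "integrity": 90.0},
}

scorecard = {
    attr: round(sum(dims.values()) / len(dims), 1)
    for attr, dims in audit_results.items()
}
overall = round(sum(scorecard.values()) / len(scorecard), 1)
print(scorecard, "enterprise score:", overall)
```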


The data-quality audit also allows an organization to build its business case for further action. An extrapolation of the problems found in the data sample can quickly identify and quantify the impact of poor-quality data.


Step 4. Define a target set of data-quality metrics against each attribute, system, application, and enterprise. Based on the audit results and the impact that each attribute, application, database, or system has on the organization's ability to manage risk, the organization should define a set of data-quality targets at each of these levels. This task is clearly not just IT driven; it is a business and, indeed, an executive issue. The common language we described is tremendously useful when data-quality targets are being discussed and agreed upon.


Step 5. Set up an enterprisewide data-quality monitoring program, and use data to drive process change. In this step, the organization builds the business rules required to monitor the various attributes and data-quality dimensions identified in the prior steps. Data can be monitored:

* At the time of acquisition

* When data passes into and out of a department or a set of applications or systems

* When data is moved in and out of geographies

* When data is moved in and out of the enterprise

* At data repositories


Once built, the data monitoring rules can be deployed globally across applications, systems, and geographies. This step helps an organization understand the nature of the data-quality problem and provides insight into where these problems arise. Measurement and monitoring are central to all quality-management systems, and data quality is no exception. Enterprises that systematically measure data quality in this way provide a foundation for effective data-quality improvement and for understanding the process changes required to improve the quality of data. Data stewardship programs, where business owners are made accountable for key data elements, are a proven way to ensure process change.
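In practice, such a program amounts to a library of business rules evaluated wherever data is acquired or moved. A minimal sketch follows, with assumed field names and value ranges; real deployments would run many more rules at each monitoring point.

```python
# Sketch of Step 5 monitoring rules applied to a batch of incoming records.
# Field names and thresholds are assumptions for illustration.

rules = [
    ("probability_of_default present",
     lambda r: r.get("probability_of_default") is not None),
    ("probability_of_default in [0, 1]",
     lambda r: r.get("probability_of_default") is None
               or 0.0 <= r["probability_of_default"] <= 1.0),
    ("obligor_id present",
     lambda r: bool(r.get("obligor_id"))),
]

def monitor(records):
    """Return the failure rate of each rule over a batch of records."""
    failures = {name: 0 for name, _ in rules}
    for r in records:
        for name, check in rules:
            if not check(r):
                failures[name] += 1
    return {name: count / len(records) for name, count in failures.items()}

batch = [
    {"obligor_id": "A-102", "probability_of_default": 0.03},
    {"obligor_id": "",      "probability_of_default": 1.7},
    {"obligor_id": "A-104", "probability_of_default": None},
]
print(monitor(batch))
```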


Step 6. Identify gaps against targets. Once a data-quality monitoring program is instituted, an organization can quickly identify the gaps against the targets defined in Step 4. This leads to a specific and focused effort to improve the quality of data and measure results.
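A minimal sketch of the gap analysis, comparing illustrative measured scores against the targets defined in Step 4, so that remediation can be prioritized by the size of the shortfall:

```python
# Step 6 sketch: gaps between targets (Step 4) and measured scores (Step 5).
# All numbers are illustrative.

targets  = {"probability_of_default": 99.0, "maturity_date": 98.0, "obligor_id": 99.5}
measured = {"probability_of_default": 92.0, "maturity_date": 98.4, "obligor_id": 86.5}

gaps = {attr: round(targets[attr] - measured[attr], 1)
        for attr in targets if measured.get(attr, 0) < targets[attr]}

# Largest gaps first, so improvement effort can be focused.
for attr, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{attr}: {gap} points below target")
```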


Step 7. Institute an automated cleansing effort. This stage creates the improvement processes for reaching the data-quality objectives, defining and implementing business rules to cleanse, standardize, and transform the data. Improving the completeness and accuracy of data requires access to the appropriate reference content. When the processes are defined, test them against representative data extracts before applying them to the full dataset.
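The sketch below shows what a rule-based cleansing step might look like, with an assumed reference list of status codes and hypothetical field names; as noted above, such rules would first be validated against representative extracts.

```python
# Step 7 sketch: rule-based cleansing and standardization.
# The reference mapping and field names are assumptions for illustration.

STATUS_MAP = {"ACT": "ACTIVE", "A": "ACTIVE", "CLSD": "CLOSED", "C": "CLOSED"}

def cleanse(record):
    out = dict(record)
    # Standardize free-form status codes against a reference list.
    status = (out.get("account_status") or "").strip().upper()
    out["account_status"] = STATUS_MAP.get(status, status or None)
    # Trim and upper-case identifiers so duplicates can be matched later.
    if out.get("obligor_id"):
        out["obligor_id"] = out["obligor_id"].strip().upper()
    return out

print(cleanse({"obligor_id": " a-102 ", "account_status": "clsd"}))
# {'obligor_id': 'A-102', 'account_status': 'CLOSED'}
```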


Step 8. Identify and implement the business and process changes required to improve quality. The ultimate goal of data-quality management is to institute change within the organization to ensure data quality is maintained in the long term. This may require alignment of various groups, especially those responsible for upstream data production (e.g., call centers collecting data for operational use must consider other enterprisewide uses of the data). Financial incentives and budgeting priorities may be needed to implement these changes. Reporting and data-quality monitoring are key tools for helping to ensure that data quality is embedded in the organizational culture.


Step 9. Set up a continuous monitoring program. Data quality degrades over time as new processes, applications, and systems come on board. It is important to institute a continuous monitoring program to ensure that data quality remains sufficient to meet the enterprise risk-management goals.
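One simple way to operationalize continuous monitoring is to track dimension scores over time and flag attributes that drift below a baseline. The sketch below uses illustrative monthly scores and an assumed tolerance.

```python
# Step 9 sketch: detect attributes whose quality scores are degrading
# relative to a baseline. Scores and tolerance are illustrative.

history = {
    "probability_of_default": [96.0, 95.5, 94.0, 91.2],  # monthly scores
    "maturity_date":          [98.0, 98.1, 98.3, 98.2],
}

def degrading(series, tolerance=2.0):
    """True if the latest score has fallen more than `tolerance` points below the first."""
    return series[0] - series[-1] > tolerance

alerts = [attr for attr, series in history.items() if degrading(series)]
print(alerts)  # ['probability_of_default']
```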


In our experience, these steps ensure a successful deployment of a data-quality program.


Conclusions


High-quality data is a critical success factor in enterprise-wide programs such as ERM. Conversely, data-quality initiatives thrive when business initiatives such as ERM require improvements in the quality of the data.


Use initiatives such as ERM to drive the implementation and improvement of data management and data quality. Successful data-quality programs are crucial to mitigating the operational risks associated with implementing a complex initiative such as ERM.


Arriving at a common language and framework between business and IT can help both data-quality and ERM initiatives. The detailed data-quality methodology presented here can be used to jumpstart data-quality programs in the enterprise.


REFERENCES


AIM Global Data and Risk Management Survey 2005. Available at http://www.aim-sw.com/topics/news-dmstudy-2005


Crouhy, M., D. Galai, and R. Mark. The Essentials of Risk Management, McGraw-Hill, 2005.


Eckerson, Wayne. Data Quality and the Bottom Line: Achieving Business Success through a Commitment to High Quality Data, The Data Warehousing Institute, February 2002. Available at http://www.tdwi.org/research/display.aspx?ID=6045


Enterprise Risk Management-Integrated Framework. Available at http://www.coso.org/Publications/ERM/COSO_ERM_ExecutiveSummary.pdf


Financial Compliance and Data Quality, Research Paper. Available at http://www.similaritysystems.com/download.php


Krishna, Dilip R. "Enterprise Risk Management: Illuminate the Unknown," Intelligent Enterprise, (December 2005). Available at http://www.intelligententerprise.com/showArticle.jhtml?articleID=174300345


Vikram Ramchandra is a director of alliances in Informatica's data quality division (formerly Similarity Systems).


vikram.ramchandra@informatica.com


Sreedhar Srikant is a senior data warehouse consultant in the Teradata Financial Services Risk Center of Excellence.


sreedhar.srikant@teradata-ncr.com


Copyright Data Warehousing Institute Second Quarter 2006
