Best Practices For Client & Counterparty Data Management

The role of client and counterparty risk management has assumed greater strategic importance over the last few years in light of heightened regulatory scrutiny and the introduction of new rules around AML/KYC, ultimate beneficial ownership, global data privacy and investor protection. With the growth of AI/RPA-enabled compliance processes, the quest for accurate, high-quality data has become the holy grail for corporate and investment banks.

And yet, according to Fenergo’s recent Industry Trends report, 74% of banks surveyed believe that data management is overlooked strategically despite being among the top three most critical business concerns, and only 15% of respondents stated that they had fully automated the collection of client data. The prevalence of siloed, fragmented solutions has created pools of disparate, unconnected data, resulting in issues around data quality, transparency and visibility of ultimate beneficial ownership structures. Much of the underlying problem is customer data that is disjointed, semi-structured or unstructured.

While effective client and counterparty data management has become a vital aspect of regulatory compliance, the industry still struggles to put in place the policies, procedures and solutions that a well-designed client and counterparty data management system requires.

The Importance of Well-Executed Data Management 

Many financial institutions are busily trying to remedy the current situation by integrating internal data repositories and external data providers to improve the flow of clean, golden-source data throughout their organization. According to Chartis research, the biggest area of risk tech spend for Tier 1 banks is on risk, governance and integration technology, with $25 billion spent in this area alone in 2018.

Each organization will have different transformation needs and, although technologies exist to assist with these transformations, experience has shown that the data management process should be underpinned by robust, transparent best-practice principles centred on empowering the business user. To that end, we’ve compiled a list of key principles, borne out of our practical experience in this area.

1. Data Profiling

Start with production data as soon as possible! A key component of the data management process is profiling existing data across a number of dimensions, including completeness and accuracy. Data profiling is predicated on the availability of production data. Basing the data management effort on a snapshot of production data leads to an early understanding of the content and semantics of the source data, and outlier data can be detected early. The transformation process is iterative in nature; therefore, production or production-like data is important. If necessary, the data can be masked to ensure security compliance; however, masking key data elements such as names may make processes such as de-duplication more difficult. Full data snapshots are not necessary at the outset, and representative subsets may be used.
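As a minimal illustration, the Python sketch below profiles a snapshot along the dimensions discussed above (completeness, a simple accuracy check and outlier detection). The file name, column names and validity rules are hypothetical placeholders, not a prescribed approach.

```python
import pandas as pd

# Load a representative subset of production (or production-like) data.
# "clients_snapshot.csv" and its column names are illustrative only.
df = pd.read_csv("clients_snapshot.csv")

# Completeness: share of non-null values per column.
completeness = df.notna().mean().rename("completeness")

# Accuracy proxy: rows failing a simple validity rule (illustrative rule only).
invalid_dates = df[pd.to_datetime(df["incorporation_date"], errors="coerce").isna()]

# Outlier detection on a numeric field using the interquartile range.
q1, q3 = df["annual_revenue"].quantile([0.25, 0.75])
iqr = q3 - q1
outliers = df[(df["annual_revenue"] < q1 - 1.5 * iqr) |
              (df["annual_revenue"] > q3 + 1.5 * iqr)]

print(completeness)
print(f"{len(invalid_dates)} rows with unparseable incorporation dates")
print(f"{len(outliers)} revenue outliers")
```

Profiles like this give an early, factual view of the source data before any transformation rules are committed to.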

2. Divide the Data Management Effort 

A data management exercise is a complex effort touching on numerous systems and business units within an organization. To ease this complexity, it is often necessary to delineate the project into discrete functional areas or domain areas and approach the effort in a granular manner. This approach simplifies the process and allows isolated testing to commence early in the data management process.
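One way to put this into practice is to treat each domain as its own small pipeline, so that each slice can be extracted, transformed and tested in isolation. The sketch below is illustrative only; the domain names and stub functions are assumptions, not a reference design.

```python
from typing import Dict, List

# Hypothetical per-domain pipeline: each domain gets its own extract,
# transform and validate step so testing can start early and in isolation.
def extract_legal_entities() -> List[dict]:
    return [{"name": "ACME LTD", "country": "IE"}]   # stub source read

def transform_legal_entity(record: dict) -> dict:
    return {**record, "name": record["name"].title()}

def validate_legal_entities(records: List[dict]) -> None:
    assert all(r["country"] for r in records), "country must be populated"

PIPELINES: Dict[str, tuple] = {
    "legal_entities": (extract_legal_entities, transform_legal_entity,
                       validate_legal_entities),
    # "beneficial_owners": (...), "accounts": (...), and so on per domain.
}

for domain, (extract, transform, validate) in PIPELINES.items():
    transformed = [transform(r) for r in extract()]
    validate(transformed)
    print(f"{domain}: {len(transformed)} records processed")
```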

3. Robust Transaction Processing & Logging

Strong transaction processing ensures that the process of transforming, moving and creating data in new systems is not halted by the failure of a few records. Logging failed transactions without failing an entire run builds reliability into the process. Aligned to transaction processing is an actionable, accurate and timely logging mechanism: failed transactions should be logged and made available to the departments and staff members empowered to take action.
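A minimal sketch of this pattern, using Python's standard logging module, is shown below; the record structure and failure condition are invented for illustration.

```python
import logging

logging.basicConfig(filename="migration_failures.log", level=logging.INFO)
logger = logging.getLogger("client_data_migration")

def load_record(record: dict) -> None:
    # Placeholder for the actual write into the target system.
    if not record.get("entity_id"):
        raise ValueError("missing entity_id")

def run_batch(records: list) -> tuple:
    """Process every record; log failures instead of aborting the run."""
    ok, failed = 0, 0
    for record in records:
        try:
            load_record(record)
            ok += 1
        except Exception as exc:
            failed += 1
            # An actionable entry for the team empowered to remediate it.
            logger.error("record %s failed: %s",
                         record.get("entity_id") or "<unknown>", exc)
    return ok, failed

ok, failed = run_batch([{"entity_id": "C-001"}, {"entity_id": None}])
print(f"{ok} loaded, {failed} logged for follow-up")
```

The run completes regardless of individual failures, and the log becomes the triage queue for remediation.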

4. Establish Quality Metrics & Unit Tests

Establishing metrics in conjunction with the business drives transparency into the process. These metrics often become the cornerstone of decision-making in the data management effort: problem areas are revealed earlier, and trade-offs can be made between competing objectives such as accuracy and completeness. Defining the expected quality metrics helps define the expected outcome and identify the point at which the process has reached an acceptable quality level. Unit tests play an important role in ensuring quality, as they may reveal unintended changes in the semantics of the transformed data.
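The sketch below shows what such a unit test might look like for a single, hypothetical transformation (mapping free-text country names to ISO codes); it illustrates the principle rather than prescribing a test suite.

```python
import unittest

def normalise_country(value: str) -> str:
    """Example transformation: map free-text country names to ISO codes."""
    mapping = {"ireland": "IE", "united kingdom": "GB"}
    return mapping.get(value.strip().lower(), value)

class TransformationSemanticsTest(unittest.TestCase):
    def test_known_country_is_mapped(self):
        self.assertEqual(normalise_country(" Ireland "), "IE")

    def test_unknown_value_is_preserved(self):
        # Guards against the transformation silently discarding data.
        self.assertEqual(normalise_country("Narnia"), "Narnia")

if __name__ == "__main__":
    unittest.main()
```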

5. Establish a Reporting Cadence

Business users who are involved in decisions related to data quality are drawn from functional areas across the institution. These users must fulfil their day-to-day activities in addition to making time available for the data management effort. The reporting cadence and triage procedure are, therefore, important. Reporting must occur at a tempo that allows the data management effort to move forward without overwhelming the decision maker.

6. Volume Testing

Volume tests need to be conducted relatively early in the process. Metrics and extrapolations derived from the volume tests are used to predict the timing of live production runs. These metrics are especially important if the principle of division is followed.
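As a simple illustration, the calculation below extrapolates a full production run time from a volume test; the figures are assumed for the example, not measured values.

```python
# Illustrative extrapolation from a volume test to a full production run.
sample_records = 250_000          # records processed in the volume test (assumed)
sample_minutes = 45               # wall-clock time for the volume test (assumed)
production_records = 4_000_000    # estimated records in the full migration (assumed)

throughput = sample_records / sample_minutes            # records per minute
estimated_minutes = production_records / throughput

print(f"Throughput: {throughput:,.0f} records/min")
print(f"Estimated production run: {estimated_minutes / 60:.1f} hours")
```

Repeating the extrapolation per domain gives a timing estimate for each slice when the division principle above is followed.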

Data management is a vital first step towards achieving compliance. The ability to link all legal entity data and documentation together to achieve a single, comprehensive view that is securely accessible from one location holds many advantages for compliance and risk management teams, as well as other parts of the institution (e.g. business development/customer onboarding). From a compliance perspective, it cements the relationship between risk management and data management. Higher-quality data management gives financial institutions the ability to accurately measure risk exposure and comply with an ever-growing list of regulations, all of which aim to mitigate or prevent risk in some way. By making information and data easily accessible, in a standardized format that is available on demand, financial institutions can ensure they are meeting their various regulatory obligations.

If you would like to learn more about achieving an integrated, holistic approach to Client & Counterparty Data Management, download our dedicated whitepaper, Getting to Grips with Client & Counterparty Data.