Data Quality for Risk

Decouple IT and data to the benefit of all

Many risk professionals find that they spend too much of their time gathering, preparing and validating data. Whilst a lack of trust in enterprise data quality is a major factor in this misallocation of resource, it is doubly frustrating when the tools for correcting data at source can only be operated by technology experts. For a CRO, this dependency of risk staff on IT resources leads either to delays in getting the job done, or to the evolution of “shadow IT” infrastructure within risk management to circumvent the issue.

Adopting a self-service approach, the Datasynthesis platform offers no-code data governance and data quality management. This frees risk professionals from dependency on IT time and resources to address data issues, and increases IT productivity by leaving data issues to the people who know the data best. Combining this approach with the scalability of the platform enables risk and data experts to ask questions such as “Tell me about any data issues right now” and receive enterprise-scope responses in real time, delivering a new level of transparency for enterprise data quality.
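To make the idea concrete, a no-code quality rule can be thought of as a declarative description (field, check) that a generic engine evaluates over records. The sketch below is purely illustrative: the rule format, check names and trade fields are assumptions, not the platform's actual configuration.

```python
# Hypothetical sketch: declarative data quality rules, as a risk analyst
# might define them in a no-code UI, evaluated by a generic engine.
# Rule shapes, check names and field names are illustrative only.

RULES = [
    {"field": "notional", "check": "non_negative"},
    {"field": "counterparty", "check": "not_null"},
]

CHECKS = {
    "non_negative": lambda v: v is not None and v >= 0,
    "not_null": lambda v: v is not None,
}

def find_issues(records, rules=RULES):
    """Return one issue dict per failing (record, rule) pair."""
    issues = []
    for i, rec in enumerate(records):
        for rule in rules:
            if not CHECKS[rule["check"]](rec.get(rule["field"])):
                issues.append({"record": i, "field": rule["field"],
                               "check": rule["check"]})
    return issues

trades = [
    {"notional": 1_000_000, "counterparty": "ACME"},
    {"notional": -5, "counterparty": None},
]
issues = find_issues(trades)
```

Because the rules are data rather than code, answering “tell me about any data issues right now” is just running `find_issues` over the current records, with no bespoke IT development per rule.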

Real-time data quality

Pro-active prevention of the spread of data issues

As mentioned previously, lack of trust in enterprise data quality is a key motivation in many efforts by risk staff to validate the data they have sourced. Many data errors are only identified in downstream risk systems and processes. Unless the corrections identified are propagated back to the data source, at best this results in multiple data teams identifying the same issues and duplicating corrective action. At worst, data errors are corrected locally in a single risk system, leaving other users and systems unaware of the issue. This siloed approach to data quality, often with multiple data quality tools used across multiple departments, leads to inefficiency, expense and increased operational risk at an enterprise level.

The cloud-native scalability of the Datasynthesis Platform enables a much more pro-active approach to enterprise data quality. Rather than managing data quality downstream on a local, value-by-value exception basis, it can be managed upstream as an enterprise process. There is no waiting for batch processes to complete: data quality for the entire enterprise can be monitored in real time on data quality dashboards, and issues mitigated before data errors have spread to downstream systems and users.
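One way to picture upstream versus downstream quality management is a validation gate at the point of ingestion: records are checked once, and failures are quarantined before they can fan out to consuming systems. This is a toy illustration of the idea under invented names, not the platform's API.

```python
# Toy upstream data quality gate: records are validated at ingestion and
# failing records are quarantined before they reach downstream systems.
# Validator names and record fields are illustrative assumptions.

def ingest(records, validators):
    """Split records into (accepted, quarantined); each entry is
    (record, list_of_failed_check_names)."""
    accepted, quarantined = [], []
    for rec in records:
        failures = [name for name, check in validators.items()
                    if not check(rec)]
        (quarantined if failures else accepted).append((rec, failures))
    return accepted, quarantined

validators = {
    "has_isin": lambda r: bool(r.get("isin")),
    "positive_price": lambda r: r.get("price", 0) > 0,
}

feed = [{"isin": "US0378331005", "price": 189.5},
        {"isin": "", "price": -1.0}]
accepted, quarantined = ingest(feed, validators)
```

The design point is that the gate runs once, upstream, so every downstream consumer inherits the same corrected view instead of re-validating locally.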

Adoption of industry data standards

Better understanding of your data and reduced reconciliation effort

Whilst data within a specific tool might be of good quality locally, enterprise data quality can still be heavily compromised by poor consistency across systems, departments and geographies. Inconsistent data leads to confusion over what data means in one system when compared to another, and as a result dramatically increases the reconciliation effort needed when trying to aggregate data.

The Datasynthesis Platform uses a Common Data Model (alternatively known as a Canonical Data Model) to represent all data entities and relationships in their simplest possible form. For each operational system, only one transformation is needed between Canonical form and the system’s local data model, allowing enterprise consistency to be maintained through this common understanding used by all systems. Combine this enterprise-level understanding with industry standards such as FIBO and you have a platform for increased data collaboration and reduced reconciliation.
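The arithmetic behind the canonical approach: with N systems, point-to-point integration needs up to N×(N−1) translations, while a hub-and-spoke canonical model needs only one mapping pair per system. A minimal sketch, with invented system and field names:

```python
# Toy hub-and-spoke translation through a canonical model. Each system
# defines one mapping to and from canonical form; any-to-any translation
# then composes the two. System and field names are invented examples.

TO_CANONICAL = {
    "sys_a": lambda r: {"party": r["cpty"], "amount": r["amt"]},
    "sys_b": lambda r: {"party": r["counterparty_name"], "amount": r["value"]},
}
FROM_CANONICAL = {
    "sys_a": lambda c: {"cpty": c["party"], "amt": c["amount"]},
    "sys_b": lambda c: {"counterparty_name": c["party"], "value": c["amount"]},
}

def translate(record, source, target):
    """Translate a record between any two systems via the canonical model."""
    return FROM_CANONICAL[target](TO_CANONICAL[source](record))

out = translate({"cpty": "ACME", "amt": 100}, "sys_a", "sys_b")
```

Adding an Nth system means writing one new mapping pair, after which it can exchange data with every existing system.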

Structured and unstructured data

High data quality regardless of data type

Whilst structured data is still of vital and fundamental operational importance, newer sources of data are not always tabular in nature. Data quality management is important for unstructured and semi-structured data too, and this is not possible if your data quality tools are designed to only support traditional structured data.

Graph database technology enables the Datasynthesis Platform to manage complex relationships between data, regardless of data type. Bringing unstructured data within the scope of your data quality initiatives means you can control any type of data more efficiently and more consistently.
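To show why a graph suits mixed data types, the sketch below links a structured trade record to an unstructured document in one property graph, so both can be governed by the same relationships. The adjacency-list model and node names are illustrative assumptions, not the platform's data model.

```python
# Minimal property-graph sketch: a structured trade node and an
# unstructured document node connected by a typed edge, so both sit in
# the same quality and lineage graph. Names are illustrative only.

nodes = {
    "trade:42": {"type": "trade", "notional": 5_000_000},
    "doc:confirmation-42": {"type": "document",
                            "text": "Trade confirmation ..."},
}
edges = [("trade:42", "EVIDENCED_BY", "doc:confirmation-42")]

def neighbours(node_id, relation):
    """All nodes reached from node_id via edges of the given relation."""
    return [dst for src, rel, dst in edges
            if src == node_id and rel == relation]

linked = neighbours("trade:42", "EVIDENCED_BY")
```

A quality rule such as “every trade must be evidenced by a document” then becomes a simple graph check, identical in form whether the target node is tabular or free text.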

As-at historic data versioning

Reproduce your universe of data at any point in time

Being able to reproduce the data used in reports at any point in time is a key capability for dealing with regulatory, risk and audit reporting requests. Due to architectural constraints, many data tools lack this capability, but even for those that do have it there are often significant issues. One key issue is that the scope of the data, rules and transformations a tool manages is constrained to its own local world of data, rather than covering flows and transformations across and between downstream systems and operational data stores.

With the Datasynthesis Platform, your policies automatically define how much history is stored, and the raw data feeding in and out of all systems in your data ecosystem is available wherever and whenever you need it. As a result, you can keep the regulators happy by reproducing the state of data at any point in time, for any part of your data architecture.
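An as-at query can be sketched as a version history keyed by timestamps: every write appends a new version, and an as-at read returns the latest version at or before the requested time. This is a toy model of the idea, not the platform's storage engine, and it assumes writes arrive in timestamp order.

```python
# Sketch of as-at versioning: each write appends a timestamped version;
# an as-at read finds the latest version at or before the requested
# time. ISO-8601 date strings compare correctly as plain strings.
import bisect

class VersionedValue:
    def __init__(self):
        self._times, self._values = [], []

    def write(self, timestamp, value):
        # Assumes writes arrive in ascending timestamp order.
        self._times.append(timestamp)
        self._values.append(value)

    def as_at(self, timestamp):
        """Value as it stood at the given time, or None if not yet set."""
        i = bisect.bisect_right(self._times, timestamp)
        return self._values[i - 1] if i else None

rating = VersionedValue()
rating.write("2023-01-01", "AA")
rating.write("2023-06-15", "A")  # downgrade
```

A report regenerated for a March 2023 audit would see the "AA" rating, exactly as the original report did, even though the current value is "A".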

Lineage for the entire lifetime of data

Track your data flows across all your operational systems

Being able to understand the flow of your data is vital for increased efficiency and security in how data is procured, ingested and used. Due to constraints in scale and scope, many data tools can only tell you how data flows locally into the tool, how it is transformed and where the tool exports it to. You are left guessing where the data came from and, more concerningly from an operational risk point of view, where it flows to.

The Datasynthesis Platform tracks all data flows throughout the lifetime of data, enabling clear understanding of which systems are dependent on what sources, and whether data is being used in compliance with its licensing terms. Given that the platform is used to define operational data flows, all data lineage reporting is an accurate and direct reflection of the flows actually in operation, rather than a separately maintained model.
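At its core, lineage over data flows reduces to reachability in a directed graph whose edges say “source feeds target”. The sketch below computes upstream lineage that way; the flow names are invented for illustration.

```python
# Sketch: lineage as reachability over a directed graph of data flows.
# Each edge means "source feeds target"; the upstream lineage of a
# system is every node that can reach it. Flow names are invented.

FLOWS = [("vendor_feed", "staging"),
         ("staging", "risk_engine"),
         ("staging", "finance_dw"),
         ("risk_engine", "regulatory_reports")]

def upstream(system):
    """All systems whose data (directly or transitively) feeds `system`."""
    result, frontier = set(), {system}
    while frontier:
        node = frontier.pop()
        for src, dst in FLOWS:
            if dst == node and src not in result:
                result.add(src)
                frontier.add(src)
    return result

sources = upstream("regulatory_reports")
```

The same traversal run in the opposite direction answers the operational risk question of where a given source's data ultimately flows to.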

Provision with security and trust

Secure, governed access to your data

Data security is a key concern for all organisations. Regulations such as GDPR have driven many security initiatives, but in addition the sheer reputational damage of a data breach is, for many, unquantifiable. In a data architecture comprised of many legacy tools with complex cross-dependencies, it is unsurprising that weaknesses exist and are difficult to mitigate.

The Datasynthesis Platform provides a single set of secure services for accessing enterprise data. These services take care of authenticating credentials, authorizing users and granting access only to the data they have been permissioned to see, based on their role in the organization. Given the scalability of the platform, point-to-point and departmental tools can gradually be rationalised as you move towards a modern, simpler and more secure data architecture.
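Role-based access of this kind can be sketched as a mapping from roles to permitted data sets, checked once at the single service layer rather than in every tool. The roles and data set names below are illustrative assumptions.

```python
# Toy role-based access check at a single data access service: a user's
# role is resolved and only permissioned data sets are served. Role and
# data set names are illustrative assumptions.

ROLE_PERMISSIONS = {
    "risk_analyst": {"market_data", "positions"},
    "auditor": {"positions", "audit_log"},
}

def authorise(role, dataset):
    """True if the given role is permissioned to read the data set.
    Unknown roles get no access by default."""
    return dataset in ROLE_PERMISSIONS.get(role, set())
```

Centralising the check means a permission change takes effect everywhere at once, instead of needing to be replicated across point-to-point and departmental tools.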