Case Study: Consolidation of information through Data Warehousing for banks

With the era of Big Data upon us, banks, fintechs, and non-bank lenders all have one thing in common: multiple systems holding large amounts of data. Banks have a head start over newer competitors, with decades of consumer data and a long history of industry experience. Yet whether the financial institution is new or old, all want to use this data to gain insight into their business and make better decisions. This wealth of data is essential for reporting to authorities, analysing customer behaviour, and internal reporting (sales, stock, balance, etc.).
Why is data consolidation an essential part of Data Warehousing?
It is an enormous undertaking to get a full overview of the customer journey when data is stored in different formats in different locations. With data spread across multiple platforms, companies are likely missing out on some of the insights it could provide. This makes data consolidation a must! With all data in one place, raw data can be turned into insights that drive better, faster decision-making.
Data consolidation, sometimes known as data integration, is the collection, integration, and storage of varied data in a single place. Data Warehousing is the process of constructing and using a Data Warehouse, which integrates all of that data into a system that supports analytical reporting and the deep-dive, ad-hoc queries that are invaluable to strategic decision-making.
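As a rough illustration, here is a minimal sketch in Python of what consolidating several sources into a single repository can look like. The file names and column names are hypothetical, and a real warehouse load would run on proper ETL tooling rather than SQLite:

```python
import csv
import sqlite3

# Minimal sketch: consolidate customer extracts from two hypothetical
# source systems into a single "warehouse" table. File and column
# names are invented for illustration.
SOURCES = ["core_banking_customers.csv", "loan_origination_customers.csv"]

conn = sqlite3.connect("warehouse.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS customer (
        customer_id   TEXT,
        source_system TEXT,
        full_name     TEXT,
        loaded_at     TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")

for path in SOURCES:
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # Load the data as-is; analysis and reporting happen downstream.
            conn.execute(
                "INSERT INTO customer (customer_id, source_system, full_name) VALUES (?, ?, ?)",
                (row["customer_id"], path, row["full_name"]),
            )
conn.commit()
```

Once every source lands in the same place, the same queries and reports can run across all of them instead of one system at a time.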
Having a single source to store information, as provided by a Data Warehouse, streamlines resources and makes it easier to identify patterns and look for insights across multiple types of data that might otherwise not be available. Moreover, data consolidation reduces costs related to reliance on multiple databases and inefficiencies such as data duplication.
Data consolidation is only possible if data is centralised. A centralised repository such as a Data Warehouse provides a broad, integrated view of all data assets, with relevant data clustered together, allowing companies to turn numbers into insights.
Approaches to Data Warehousing
Now that we understand the importance of data consolidation, how can banks, fintechs, and lenders implement such a system easily and quickly? Many organisations are aware of the importance of Data Warehousing and try to solve these issues with Data Warehouse projects, Data Lake projects, or vendors of various BI tools. In general, these projects tend to be inflexible and take a long time to complete, with low usability after launch and unforeseen costs. We do not advise such an approach.
While it would be great to have a one-size-fits-all solution, the reality is that such a solution does not, and cannot, exist. Although on the surface many financial institutions are structured in similar ways, they all have very different system setups, which makes it impossible to deliver a Data Warehouse solution as stand-alone software that can simply be installed on a server. There has never been a successful Data Warehouse project without customized code; that is a hard fact.
Still, a lot of Data Warehouse code can be generated from configuration. The right balance between customization and configuration, combined with sound project methodology, design patterns, and automation, is what makes a great Data Warehouse solution.
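To make the configuration-versus-customization idea concrete, here is a minimal, hypothetical sketch in Python of generating table definitions from configuration alone; the table and column names are invented for illustration:

```python
# Minimal sketch of generating warehouse code (here, DDL) from configuration.
# The table and column names are hypothetical.
TABLES = {
    "customer": ["customer_id", "full_name", "segment"],
    "account":  ["account_id", "customer_id", "balance"],
}

def generate_ddl(table: str, columns: list[str]) -> str:
    """Build a CREATE TABLE statement from configuration alone."""
    cols = ",\n    ".join(f"{col} TEXT" for col in columns)
    return f"CREATE TABLE IF NOT EXISTS {table} (\n    {cols},\n    loaded_at TEXT\n);"

for table, columns in TABLES.items():
    print(generate_ddl(table, columns))
```

Everything that follows a repeatable pattern (staging tables, load procedures, historization) can be generated this way, leaving custom code only where a source system genuinely deviates.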
The Open/Closed Principle in Data Warehousing projects
As an example, the Open/Closed Principle (OCP) is a useful tool in Data Warehousing, and code that follows it lends itself well to being generated from configuration. The OCP is a software design principle in which code is "open" for extension but "closed" for modification: every change to the code takes the form of an extension. This protects against breaking changes, so new releases do not affect already released code, and it prevents unintended side effects that can introduce bugs in other parts of the system and undo months or years of iterative improvement.
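For readers less familiar with the principle, a minimal Python sketch of the OCP in ordinary application code might look like this (the exporter classes are hypothetical and only illustrate "extend, don't modify"):

```python
import json
from abc import ABC, abstractmethod

class ReportExporter(ABC):
    """Extension point: new export formats are added as new subclasses."""
    @abstractmethod
    def export(self, rows: list[dict]) -> str: ...

class CsvExporter(ReportExporter):
    def export(self, rows: list[dict]) -> str:
        header = ",".join(rows[0].keys())
        lines = [",".join(str(v) for v in row.values()) for row in rows]
        return "\n".join([header, *lines])

# A new requirement (JSON output) is met by extension only; the existing,
# already-tested CsvExporter is never modified.
class JsonExporter(ReportExporter):
    def export(self, rows: list[dict]) -> str:
        return json.dumps(rows)
```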
The OCP pays off not only when writing code but also as a principle when persisting data in the Data Warehouse. Applying it to the data and the data structures themselves shields the warehouse from unforeseen consequences of poor data quality in source systems. To achieve this, the modelling technique must emphasise the separation of business keys from descriptive data. Applying the OCP consistently is a great way to create successful, sustainable data warehousing systems.
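One common way to achieve that separation, sketched below under the assumption of a Data Vault-style split into hubs (business keys) and satellites (descriptive attributes), is to add new attributes as new structures rather than altering existing ones; the class and field names are illustrative only:

```python
from dataclasses import dataclass
from datetime import datetime

# Business keys live in one structure (a "hub"); descriptive attributes live
# in separate, append-only structures (one "satellite" per group of attributes).

@dataclass(frozen=True)
class CustomerHub:
    customer_id: str      # business key only, never any descriptive data
    loaded_at: datetime

@dataclass(frozen=True)
class CustomerNameSat:
    customer_id: str      # points back to the hub
    full_name: str
    loaded_at: datetime

# A later requirement (customer segment) is met by adding a new satellite,
# not by altering the existing ones: open for extension, closed for modification.
@dataclass(frozen=True)
class CustomerSegmentSat:
    customer_id: str
    segment: str
    loaded_at: datetime
```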
Creating a successful Data Warehouse system
Näktergal Consulting has a vast amount of experience implementing Data Warehouse projects in banks and other financial institutions. In our experience, a great Data Warehouse project meets these requirements:
- All data from each source is collected, nothing is altered, and everything is historized (see the sketch after this list).
- A new feature can be rolled out without the need of retesting already implemented functionality.
- The cost of extending the Data Warehouse is low.
- The projects are fast, taking as little as one-eighth of the time of a conventional Data Warehouse project.
- The maintenance costs are low.
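The first requirement, "nothing is altered, everything is historized", can be illustrated with a minimal Python sketch: changes are appended as new versions rather than overwriting existing records. The field names and in-memory list stand in for real warehouse tables:

```python
from datetime import datetime

# Minimal sketch of historization: every change is appended as a new version;
# old versions keep their original values and are only closed with a valid_to date.
history: list[dict] = []

def historize(customer_id: str, full_name: str, now: datetime) -> None:
    current = next(
        (r for r in history if r["customer_id"] == customer_id and r["valid_to"] is None),
        None,
    )
    if current and current["full_name"] == full_name:
        return  # no change, nothing to record
    if current:
        current["valid_to"] = now  # close the old version, never overwrite it
    history.append({
        "customer_id": customer_id,
        "full_name": full_name,
        "valid_from": now,
        "valid_to": None,
    })

historize("C1", "Ada Lovelace", datetime(2023, 1, 1))
historize("C1", "Ada King", datetime(2024, 6, 1))  # name change adds a new version
```

Because nothing is ever overwritten, any report can be reproduced exactly as it looked at an earlier point in time.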
The experts at Näktergal Consulting each have over 10 years of experience delivering solid Data Warehouse solutions for banks, retailers, and other institutions. We deliver these solutions in the cloud or on-premises. Reach out to us today to find out how quick and easy it is to implement a Data Warehousing solution tailored to you.
TL;DR:
Data Warehouse projects are complicated, and there is no simple, one-size-fits-all solution for a great one. The benefits of Data Warehousing are nonetheless enormous: not only does it make reporting easier, it also improves efficiency, transparency, and decision-making.