Financial Services Data Models Made Actionable with Data Integration
Bridging the gap between data models and data integration
Overcoming Business Inertia
Every now and again it takes some fresh thinking to get businesses innovating and moving again. They often become bogged down by their own structures, processes, technologies and politics, not to mention regulation and governance requirements. Teams are then organized around their key operational or technical functions, shrouding the business in an inertia that inhibits new ideas, new ways of working and, importantly, growth.
The status quo can be very powerful, and it can be difficult to break away from even with the political will. The larger the organization, the bigger the internal barriers to change are likely to be. This is where the thought leader comes in: someone to tear down the walls of inhibition, or perhaps a maverick, dedicated to driving change and innovation by challenging others to think, behave and work differently. It is hardly ever an easy task.
In our line of work, we often come across such organizations and people. It makes what we do so interesting; being the facilitator that helps a company move forward in ways that were previously unimaginable. Sometimes the technology and the way it is deployed is the problem, other times it's the people involved, but usually both factors combine to hold a business back.
We get to see it all, and are always privileged to be asked to help clients in these situations. Such work is complex and challenging in equal measure, both from an engineering and a political point of view. However, where the political will for change is strong, with the key leadership executives on board, the possibility for innovation becomes reality.
We have been involved in a recent customer project where the client was trying to overcome several of these issues. Thanks to some innovative thinking from management, as well as some innovative uses of technology, they were able to achieve some significant business benefits.
You can read more detail in the white paper, but here's the background…
Restrictive Business Practices
A large financial services organization came to us with a very specific need. They have various regulatory and governance rules to comply with, which means there are many internal requirements guiding their development and data processing pipelines. To help ensure compliance, they must be able to model their data processing pipeline so that non-technical users can participate in audits and feature development. Maintaining data models and runnable code in multiple tools with only weak links between them (in the form of documentation, or just institutional knowledge) introduces significant overhead and delay.
Each change in the data model needs to be verified by the data modelling team, then sent to another team for implementation. The changes must be analysed before being implemented in the technology of choice. Additional testing is then required to confirm that the implementation matches the model and that no new bugs have been introduced. You can see how quickly teams can find themselves tied up in knots.
We were called into action by a visionary executive (we'll call him Mr. X) who recognized the pain experienced by different teams and saw that important projects were getting bogged down by lengthy processes. His vision was to innovate and improve these restrictive practices by bridging data models with data integration techniques.
The Big Idea – Bridging Data Models and Data Integration
Generating runnable code directly from data models is a significant improvement on the process described above. It removes various obstacles and inefficiencies from implementation: fewer resources are required, timelines are shortened and, most importantly, the generated code is guaranteed to match the model. This can significantly improve an organization's ability to prove that regulations are being followed, and it reduces the effort needed for auditing and data lineage exercises.
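To make the idea concrete, here is a minimal sketch of model-to-code generation. The model format (`FieldMapping` with a source field, a target field and a transform expression) is entirely hypothetical and is not CloverDX's actual metadata; the point is only that, because the job is emitted from the model itself, the generated code cannot drift from it.

```python
# Minimal sketch: generate a runnable transformation from a data model.
# The model format below is a hypothetical illustration, not CloverDX's.
from dataclasses import dataclass

@dataclass
class FieldMapping:
    source: str     # field name in the source record
    target: str     # field name in the output record
    transform: str  # Python expression applied to `value`

# A toy "data model": one mapping per output field.
MODEL = [
    FieldMapping("cust_name", "customer_name", "value.strip().title()"),
    FieldMapping("balance", "balance_usd", "round(float(value), 2)"),
]

def generate_job(model):
    """Emit the source of a transformation function from the model.

    Because the code is derived from the model, it matches the model
    by construction -- the guarantee the approach relies on."""
    lines = ["def transform(record):", "    out = {}"]
    for m in model:
        lines.append(f"    value = record[{m.source!r}]")
        lines.append(f"    out[{m.target!r}] = {m.transform}")
    lines.append("    return out")
    return "\n".join(lines)

# Compile the generated source and run it against a sample record.
namespace = {}
exec(generate_job(MODEL), namespace)
result = namespace["transform"]({"cust_name": "  jane doe ", "balance": "1023.456"})
```

In a real deployment the generator would target the integration platform's job format rather than raw Python, but the principle is the same: one artifact, the model, drives both documentation and execution.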
Turn Data Models into ETL Jobs at the Click of a Button
Modeling tools are good at modeling, no surprises there, but they make poor execution environments. It's easy to see data relationships and transformations, yet the tools offer little ability to execute the models themselves. Further, there is no automation or monitoring, and no ability to consume data from queues, files, remote locations, web services and so on. By bridging between the data modeling environment and CloverDX data integration, our client is now able to provide the right kind of execution environment to make the data models actionable. In this way, they can execute transformations, processing data from many different sources, while having access to the monitoring tools required for production support.
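The execution-environment gap described above can be sketched as a small runner: a model-derived transform is applied to records from any iterable source (a file, a queue reader, a web-service client), with basic success/failure counters for monitoring. This is an illustrative assumption of what such a runtime does, not CloverDX's actual engine; the `run_job` function and the CSV source are made up for the example.

```python
# Hypothetical execution wrapper: applies a model-derived transform to
# records from any iterable source, with basic monitoring counters --
# the kind of runtime support that modeling tools themselves lack.
import csv
import io
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("etl-job")

def run_job(source, transform):
    """Yield transformed records, logging and counting any rejects."""
    processed = failed = 0
    start = time.time()
    for record in source:
        try:
            yield transform(record)
            processed += 1
        except Exception:
            failed += 1
            log.exception("record rejected: %r", record)
    log.info("done: %d ok, %d failed in %.2fs",
             processed, failed, time.time() - start)

# Example source: an in-memory CSV. A queue consumer or web-service
# reader would slot into the same `source` parameter unchanged.
csv_source = csv.DictReader(io.StringIO("id,amount\n1,10.5\n2,not-a-number\n"))
rows = list(run_job(csv_source,
                    lambda r: {"id": int(r["id"]), "amount": float(r["amount"])}))
```

Note the second row is rejected (its amount is not numeric) and counted as a failure rather than halting the job, which is the behaviour production monitoring typically needs.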
Organizational Change Through Technical Innovation
The white paper, and the case study detailed within it, describe what we call the Data Modeling Bridge, the solution that was built and deployed to production.
The key non-technical benefit of the approach was better collaboration between business and technical teams, who are now using a common language and better processes to improve communication and streamline workflows.
The key technical benefits of the approach were a significantly shorter time from project start to production deployment (from 9-12 months down to 2-3 months) and greater test automation with better test coverage.